BBO Discussion Forums: Google has beaten Go, what could it do to bridge?



Google has beaten Go, what could it do to bridge?

#1 User is offline   the_dude 

  • Group: Full Members
  • Posts: 224
  • Joined: 2009-November-12
  • Gender:Male
  • Location:Florida

Posted 2016-January-28, 10:34

Google's AlphaGo defeated one of the top players in the world 5 games to 0, and achieved a 99.8% winning rate against other Go programs. ... AlphaGo's next challenge will be to play the top Go player in the world over the last decade, Lee Sedol. The match will take place this March in Seoul, South Korea.

Rather than trying to use pure computation to conquer Go's ridiculous search space, they used deep neural networks that first learned from human games and then from games AlphaGo played against itself.

What do you think Google could do with this technology if they decided to turn to bridge?

Some links:
http://googleresearc...game-of-go.html
http://www.nature.co...ature16961.html
If no one comes from the future to stop you from doing it then how bad a decision could it really be?
2

#2 User is offline   etha 

  • Group: Full Members
  • Posts: 252
  • Joined: 2005-August-25

Posted 2016-January-28, 10:54

I think it would do the same. AI is already at a decent level; it just can't beat the best players yet.
0

#3 User is offline   barmar 

  • Group: Admin
  • Posts: 21,415
  • Joined: 2004-August-21
  • Gender:Male

Posted 2016-January-28, 12:21

I've long thought that a neural network might be the best technology for computers to learn bridge. But you have to be careful with the training -- if the input data includes auctions from many different systems (e.g. both natural and strong club), the mix will probably confuse it horribly. Even a mix of SA and 2/1 would probably make learning difficult.

A few years ago Uday and I experimented with using our archives of years of BBO auctions to help GIB with its bidding: before making a bid, it would search for similar hands and auctions to see what the most common bids were. We just assumed that most auctions used similar systems (mostly SAYC and 2/1), and that the rest would be outliers that didn't hurt the statistics. We weren't using this to teach it the basic principles of bidding; it was just a sanity check: if its simulations said to make a particular bid, but the database search showed that fewer than 20% of humans chose that action, the bid was ruled out. Even with that limited scope, it was not very helpful -- there are so many combinations of hand types and auctions that the number of matches was generally too low for useful statistics, except early in the auction, where the existing bidding rules are already pretty good.
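For anyone curious, here is a minimal sketch of that kind of "human frequency" sanity check, assuming a hypothetical archive search has already returned the calls humans made on comparable hands and auctions (all names and thresholds here are illustrative, not GIB's actual code):

```python
from collections import Counter

MIN_MATCHES = 50       # below this the statistics are too noisy to trust
VETO_THRESHOLD = 0.20  # rule out calls fewer than 20% of humans chose

def passes_sanity_check(candidate_call, human_calls):
    """Return False if the simulation's choice is one humans almost never made.

    human_calls: the calls actually made by humans holding comparable hands
    in comparable auctions, as returned by an archive search.
    """
    if len(human_calls) < MIN_MATCHES:
        return True  # too few matches: fall back on the simulation alone
    frequency = Counter(human_calls)[candidate_call] / len(human_calls)
    return frequency >= VETO_THRESHOLD

# Example: the simulations like 3NT here, but only 8 of 100 humans bid it.
calls = ["3C"] * 60 + ["2NT"] * 32 + ["3NT"] * 8
print(passes_sanity_check("3NT", calls))  # False -> the bid gets ruled out
```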

#4 User is online   helene_t 

  • The Abbess
  • Group: Advanced Members
  • Posts: 17,087
  • Joined: 2004-April-22
  • Gender:Female
  • Location:UK

Posted 2016-January-28, 16:33

What I think neural networks will be particularly useful for is making inferences from the opponents' (and partner's) play. A sort of generalized restricted choice: "if he had a strong holding in declarer's second suit, he might have led a trump." So when selecting/weighting hands for the sims, you devalue hands with a strong holding in declarer's second suit.
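Something like this could plug straight into the hand selection for the sims; a rough sketch with a purely illustrative weighting rule and made-up numbers (none of this is any existing program's code):

```python
import random

def plausibility(defender_hand, led_trump):
    """Weight a candidate layout by how well it explains the defence so far.

    Illustrative rule: a defender with a strong holding (4+ cards) in
    declarer's second suit would probably have led a trump, so layouts
    where he holds that strength but didn't lead one get devalued.
    """
    strong_second_suit = len(defender_hand["declarers_second_suit"]) >= 4
    if strong_second_suit and not led_trump:
        return 0.3   # still possible, just much less likely
    return 1.0

def select_hands_for_sims(candidate_layouts, led_trump, n=1000):
    """Sample layouts for the simulations in proportion to their plausibility."""
    weights = [plausibility(lay["lho"], led_trump) for lay in candidate_layouts]
    return random.choices(candidate_layouts, weights=weights, k=n)
```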
The world would be such a happy place, if only everyone played Acol :) --- TramTicket
0

#5 User is offline   steve2005 

  • Group: Advanced Members
  • Posts: 3,150
  • Joined: 2010-April-22
  • Gender:Male
  • Location:Hamilton, Canada
  • Interests:Bridge duh!

Posted 2016-January-28, 18:20

Chess is suited to computers; so is Go, though it is more difficult.
The problem with bridge is that you don't know where all the cards are, whereas in chess and Go you know where all the pieces are.
Bidding has improved considerably but still leaves a lot to be desired. When chess programs were starting to challenge masters, bridge programs were no better than beginners at bidding.
Sarcasm is a state of mind
0

#6 User is offline   dwar0123 

  • Group: Full Members
  • Posts: 770
  • Joined: 2011-September-23
  • Gender:Male
  • Location:Bellevue, WA

Posted 2016-January-28, 20:06

steve2005, on 2016-January-28, 18:20, said:

Chess is suited to computers; so is Go, though it is more difficult.
The problem with bridge is that you don't know where all the cards are, whereas in chess and Go you know where all the pieces are.
Bidding has improved considerably but still leaves a lot to be desired. When chess programs were starting to challenge masters, bridge programs were no better than beginners at bidding.


We should start a campaign to get Google to tackle bridge next, precisely because it is not well suited for computers to solve. The challenge bridge presents, incomplete information, is an interesting and logical next step in the progression of artificial intelligence.
0

#7 User is offline   phoenix214 

  • Group: Full Members
  • Posts: 347
  • Joined: 2011-December-23
  • Gender:Male
  • Location:Riga
  • Interests:Bridge; Chess; Boardgames; Physics; Math; Problem solving; and anything that makes my brain thinking.

Posted 2016-January-28, 21:31

Somehow I think bridge is less challenging for AI than Go, although I (currently) don't have any experience in the field. Maybe I should take that as a thesis topic at university :P
0

#8 User is offline   lycier 

  • Group: Advanced Members
  • Posts: 7,612
  • Joined: 2009-September-28
  • Gender:Male
  • Location:China

Posted 2016-January-29, 06:58

Nineteen years ago, in 1997, Mr. Kasparov, the chess grandmaster ranked first in the world, played against the computer Deep Blue. It was the most shocking news on the planet: that year, for the first time, a computer beat the best of human intelligence. From then on, computers made ever more rapid breakthroughs in practical applications. It was said that Deep Blue and its successors beat Mr. Kasparov by using "brute force" techniques. In 2006 a human beat chess software for the last time; since then no human has won against the computer.
Yesterday there was a new focus: computer software has defeated a professional player at the game of Go. That's incredible; it happened so suddenly.
In general, in international chess each move offers about 35 possible choices, with about 80 moves in a match, but in Go every move might offer 250 possible choices, with at least 150 moves in a match.
Lee Sedol, the South Korean Go player, is the great player who has won the most world Go championship titles in the last 10 years. The match will be held this March in Seoul, the South Korean capital, with a $1 million prize provided by Google, which shows the strong self-confidence of the AlphaGo team.
However, in China, Japan and South Korea, whether players or fans, who believes the computer is able to beat a top human player? Nobody! Does Google's Go program possess the magic of a god? Even so, we would like to wish the AlphaGo team the best.

0

#9 User is offline   hrothgar 

  • Group: Advanced Members
  • Posts: 15,380
  • Joined: 2003-February-13
  • Gender:Male
  • Location:Natick, MA
  • Interests:Travel
    Cooking
    Brewing
    Hiking

Posted 2016-January-29, 07:10

I've never considered bridge to be a particularly complicated game for a computer to play.
I think that there are some complicated issues around disclosure, but nothing that seems particularly challenging.

Even the disclosure issues seem solvable, especially if we can require that the competitors provide the computer with a corpus of hands consistent with the bidding so far.
Note that this is something that is relatively easy for a computer to do, but hard for a human.
I think that the problems with having people play against computers are likely to be on the human side, rather than the computer side...
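Dealing such a corpus is indeed easy for a computer; here is a rough rejection-sampling sketch, with a deliberately crude constraint standing in for "consistent with the bidding so far" (the 12+ HCP / five-spade rule is just an illustrative stand-in, not anyone's actual system definition):

```python
import random

RANKS = "AKQJT98765432"
DECK = [suit + rank for suit in "SHDC" for rank in RANKS]

def hcp(hand):
    """Milton Work point count: A=4, K=3, Q=2, J=1."""
    return sum({"A": 4, "K": 3, "Q": 2, "J": 1}.get(card[1], 0) for card in hand)

def suit_length(hand, suit):
    return sum(card[0] == suit for card in hand)

def consistent_with_1s_opening(hand):
    # Crude illustrative constraint: a natural 1S opening shows 12+ HCP
    # and at least five spades.
    return hcp(hand) >= 12 and suit_length(hand, "S") >= 5

def corpus_of_consistent_hands(constraint, n=100):
    """Rejection-sample random 13-card hands until n satisfy the constraint."""
    hands = []
    while len(hands) < n:
        hand = random.sample(DECK, 13)
        if constraint(hand):
            hands.append(hand)
    return hands

corpus = corpus_of_consistent_hands(consistent_with_1s_opening)
print(len(corpus), "hands consistent with a (simplified) 1S opening")
```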

I suspect that the reason that there is no equivalent to Deep Blue is the (relative) insignificance of the game.
The serious academic researchers prefer to focus on games that are more popular. (Chess, Poker, Go)
Alderaan delenda est
2

#10 User is offline   Zelandakh 

  • Group: Advanced Members
  • Posts: 10,667
  • Joined: 2006-May-18
  • Gender:Not Telling

Posted 2016-January-29, 08:12

lycier, on 2016-January-29, 06:58, said:

In 2006 a human beat chess software for the last time; since then no human has won against the computer.

There was certainly a win against a computer in 2008, thanks to exploiting a program bug, but I can't even remember the last time I heard of a human vs computer match being reported, so it is difficult to say what other losses there might have been.
(-: Zel :-)
0

#11 User is offline   gwnn 

  • Csaba the Hutt
  • Group: Advanced Members
  • Posts: 13,027
  • Joined: 2006-June-16
  • Gender:Male
  • Location:Göttingen, Germany
  • Interests:bye

Posted 2016-January-29, 08:17

There have been some funny human vs computer games using various material odds. One of them was Nakamura vs Komodo, where the machine had:
- no f7/f2 pawn
- an exchange deficit
- 3 moves behind (White had to play 1. e4, 2. d4, 3. Nf3 and had the move)

All of them were drawn except the "3 moves behind" game, which the computer won. To me this is conclusive proof that humans can't challenge computers on a level playing field. Yeah yeah, sampling problem. But any time I see a good player asked about this, they not only agree with this assessment, they also say it is really not close at all.

https://www.chess.co...nal-battle-1331
... and I can prove it with my usual, flawless logic.
      George Carlin
0

#12 User is offline   barmar 

  • Group: Admin
  • Posts: 21,415
  • Joined: 2004-August-21
  • Gender:Male

Posted 2016-January-29, 09:50

steve2005, on 2016-January-28, 18:20, said:

The problem with bridge is that you don't know where all the cards are, whereas in chess and Go you know where all the pieces are.

This is exactly why a neural network approach is likely to be more suitable than the techniques currently in use. Neural networks are good at detecting patterns automatically, while traditional programming requires the programmer to spell out everything precisely.

In Go, you know where all the pieces are, but that still wasn't good enough for programs to match human experts. The problem is that there are so many combinations of plays that a program can't analyze them sufficiently; it has to recognize patterns from experience and intuition. Describing all those patterns in a traditional program would also be overwhelming, but a neural network can learn them on its own.

#13 User is offline   barmar 

  • Group: Admin
  • Posts: 21,415
  • Joined: 2004-August-21
  • Gender:Male

Posted 2016-January-29, 09:55

hrothgar, on 2016-January-29, 07:10, said:

Even the disclosure issues seem solvable, especially if we can require that the competitors provide the computer with a corpus of hands consistent with the bidding so far.
Note that this is something that is relatively easy for a computer to do, but hard for a human.

A corpus of hands is not very helpful without telling the computer what features they have in common, and what distinguishes them from hands that would have made other bids.

This is where the neural network would excel. You'd feed it millions of hands and auctions, and it would learn on its own how the bids relate to the hands, and which features of the hand are important.
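As a rough sketch of that kind of supervised setup (in PyTorch, with completely made-up input encodings and sizes; nothing here is GIB's or DeepMind's actual code), the network would simply be trained to predict the call a human made from the hand and the auction so far:

```python
import torch
import torch.nn as nn

# Illustrative encoding: 52 binary card flags for the hand plus a fixed-size
# vector for the auction so far; output is a score for each possible call.
N_CARDS, AUCTION_DIM, N_CALLS = 52, 64, 38  # 35 bids + pass, double, redouble

model = nn.Sequential(
    nn.Linear(N_CARDS + AUCTION_DIM, 256),
    nn.ReLU(),
    nn.Linear(256, 256),
    nn.ReLU(),
    nn.Linear(256, N_CALLS),
)

loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def train_step(features, chosen_calls):
    """One supervised step: push the network toward the call humans chose."""
    optimizer.zero_grad()
    logits = model(features)              # (batch, N_CALLS)
    loss = loss_fn(logits, chosen_calls)  # chosen_calls: (batch,) call indices
    loss.backward()
    optimizer.step()
    return loss.item()

# Dummy batch standing in for archived BBO deals and auctions.
features = torch.rand(32, N_CARDS + AUCTION_DIM)
chosen_calls = torch.randint(0, N_CALLS, (32,))
print(train_step(features, chosen_calls))
```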

#14 User is offline   lycier 

  • Group: Advanced Members
  • Posts: 7,612
  • Joined: 2009-September-28
  • Gender:Male
  • Location:China

Posted 2016-January-29, 14:33

Lee Sedol is looking forward to playing against AlphaGo.
After the date of the challenge was set, Lee said it is a great pleasure for him to play against artificial intelligence: "Whatever the outcome, it will be a very meaningful event in the history of Go. I heard that the artificial intelligence is unexpectedly strong, but I am confident I can win, this time at least."
Many readers of course support Lee. They think the biggest difference between artificial intelligence and a human is that every step of the computer's calculation is the best choice, while a human's play is not necessarily best, but a human can set a trap. Of course, many readers also strongly support AlphaGo: "Don't look down on artificial intelligence; AI has super computing power, can a human match that?"
I will vote for Lee Sedol. I think it is impossible for AlphaGo to beat a top human at present; AlphaGo may have beaten the European Go champion, but compared to the Go champions from China, Japan and South Korea, he is too weak.
2

#15 User is offline   lycier 

  • Group: Advanced Members
  • Posts: 7,612
  • Joined: 2009-September-28
  • Gender:Male
  • Location:China

Posted 2016-January-29, 15:18

For us here in East Asia, we had never heard of Google AlphaGo before; it is very strange, in fact.
0

#16 User is offline   neilkaz 

  • Group: Advanced Members
  • Posts: 3,568
  • Joined: 2006-June-28
  • Gender:Male
  • Location:Barrington IL USA
  • Interests:Backgammon, Bridge, Hockey

Posted 2016-January-29, 17:48

I agree with Lycier. Let's see what happens when the computer plays a 9-dan, and especially a 9-dan who will have studied the computer's play.
0

#17 User is offline   rhm 

  • Group: Advanced Members
  • Posts: 3,090
  • Joined: 2005-June-27

Posted 2016-February-02, 04:00

hrothgar, on 2016-January-29, 07:10, said:

I've never considered bridge to be a particularly complicated game for a computer to play.
I think that there are some complicated issues around disclosure, but nothing that seems particularly challenging.

Even the disclosure issues seem solvable, especially if we can require that the competitors provide the computer with a corpus of hands consistent with the bidding so far.
Note that this is something that is relatively easy for a computer to do, but hard for a human.
I think that the problems with having people play against computers are likely to be on the human side, rather than the computer side...

I suspect that the reason that there is no equivalent to Deep Blue is the (relative) insignificance of the game.
The serious academic researchers prefer to focus on games that are more popular. (Chess, Poker, Go)

I doubt that any of these claims or reasons are valid.
It is also a myth that there has been no serious academic effort.
How challenging or complex a game is for the human mind is not the decisive criterion for whether we will see software beating humans at it.
Many seem to believe that if only IBM, Google, Apple, or the like provided some serious resources, we would see such a Bridge computer tomorrow.
I doubt that.
No serious amount of resources would have put a man on the moon in the 19th century.
You need good ideas and a strategy before resources (money) can be put into a productive investment.
And even if these conditions are present there is no guarantee of success.
For example, despite the effort and money spent so far, we always seem to be 50 years away from the first commercial fusion reactor.

I believe that, compared to Chess or Go, the challenges of putting Bridge logic into software are very different.
To mention just two:
Bridge is a game of incomplete information, Chess and Go are not.
Bridge is a partnership game played against another partnership, neither Chess nor Go are.
This already makes it challenging to define what a good Bridge program would be.
One that plays with a clone of itself as partner? That is like identical twins playing Bridge together.
It is entirely possible that identical twins could be world class when playing together, but be only mediocre when playing with others.
They certainly would have a big advantage when playing in a partnership.
Not my definition of a great Bridge player. But this is the way Computer Bridge championships are played today.
A more suitable comparison would be two independently developed Bridge software programs playing against an expert partnership.
Two neural networks might be acceptable as partners, provided they were trained on independent deals and experience.

I am not saying we will not eventually see bridge computers capable of beating experts, but there are good reasons why nobody has come close so far.

Rainer Herrmann
1

#18 User is offline   gwnn 

  • Csaba the Hutt
  • Group: Advanced Members
  • Posts: 13,027
  • Joined: 2006-June-16
  • Gender:Male
  • Location:Göttingen, Germany
  • Interests:bye

Posted 2016-February-02, 04:12

I wouldn't mind playing bridge where my partner would be myself (careful choice of words, avoiding innuendo). If anything, I'd identify my annoying habits (in bidding or play) and would be able to correct for them much more efficiently. It can be fun but after a while exhausting to convince my partners to try out some new treatment or to scrap some of their idiotic conventions. I think a lot of people would like to at least try playing in a partnership where the other member is a clone (hopefully I still managed to avoid most of the double entendres).
... and I can prove it with my usual, flawless logic.
      George Carlin
0

#19 User is offline   Fluffy 

  • World International Master without a clue
  • Group: Advanced Members
  • Posts: 17,404
  • Joined: 2003-November-13
  • Gender:Male
  • Location:madrid

Posted 2016-February-02, 06:41

One of my projects for my webpage was "bid with yourself", where you bid hundreds of similar deals at the same time partnering yourself, with enough time in between that you hopefully forget the sequences. And maybe adding notes to yourself to be checked when the bidding is over. But I am moving to other areas ATM.
0

#20 User is offline   psyck 

  • Group: Yellows
  • Posts: 104
  • Joined: 2006-April-12
  • Gender:Male
  • Location:Bangalore
  • Interests:Bridge, Chess, Reading, Music, Travelling, Movies.

Posted 2016-February-02, 06:44

Large strides were made in Chess due to the presence of open source projects to which people from all over the world could contribute coding ideas, testing H/W, etc. The existence of some reverse engineered commercial S/W also helped ;) It is quite possible that initiating an open source Bridge project could result in Bridge programs that can play stronger than humans.
Cheers, Krishna.
_________________
Valiant were the efforts of the declarer // to thwart the wiles of the defender // however, as the cards lay // the contract had no play // except through the eyes of a kibitzer.

