"Leela is an opening book" nonsense
This notion that neural networks are actually opening books in disguise keeps rearing its silly head. I decided to test this opening-book hypothesis with one of my favorite nets. With 3 seconds per move on an RTX 2070, I investigated a few opening lines, taking the engine's top choice at each step, except for the first or second move where noted. So, what openings does this net like?
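For anyone who wants to run the same kind of probe, here is a minimal sketch using python-chess to drive lc0 over UCI. It assumes the lc0 binary is on your PATH and that the net is saved locally; the weights filename below is a placeholder, not the actual Little Ender file. It simply takes the top choice at 3 seconds per move; to force a particular first or second move, push those moves onto the board before the loop.

```python
# Minimal probe sketch: ask lc0 for its top move at 3 s/move and walk the line.
# Assumptions: "lc0" is on PATH, and the net lives at "little-ender.pb.gz"
# (hypothetical filename used here only for illustration).
import chess
import chess.engine

engine = chess.engine.SimpleEngine.popen_uci("lc0")
engine.configure({"WeightsFile": "little-ender.pb.gz"})  # placeholder weights path

board = chess.Board()
line = []
for _ in range(10):  # probe the first ten plies from the starting position
    result = engine.play(board, chess.engine.Limit(time=3))
    line.append(board.san(result.move))  # record the move in SAN before playing it
    board.push(result.move)

print(" ".join(line))
engine.quit()
```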
It likes e4 and d4. On e4, it likes e6 and c5. On e6, it likes d4. Looking a few moves deeper, we get 1. e4 e6 2. d4 d5 3. Nd2 c5 4. Ngf3 cxd4 5. Nxd4 Nf6 -- a pretty main-line Tarrasch French. If we go down the Sicilian route, we get 1. e4 c5 2. Nf3 d6 3. Bb5+ Bd7 4. Bxd7+ Qxd7 5. O-O Nc6 -- the main line of the Moscow Variation. And on d4, 1. d4 Nf6 2. Nf3 e6 3. c4 d5 4. Nc3 Be7 5. Bg5 O-O is a pretty main-line QGD.
Good evidence that this net has memorized opening moves? Not so fast. This is Little Ender, an endgame net that has never seen opening positions. In fact, it has never seen a position with more than 18 pieces.
My new (old) blog is at lczero.libertymedia.io