FYI the link requires login because it’s for edit mode. Might be good to also have a “What is Ibis?” bit here, instead of requiring people to follow the link.
At any rate, looks neat! Has there been any thought given to what happens if the Conservapedia crowd or similar groups want to get onto the network? Is it handled with instance blocking, like Lemmy?
The whole “it’s just autocomplete” line is just a comforting mantra. A sufficiently advanced autocomplete is indistinguishable from intelligence. LLMs provably have a world model, just like humans do. They build that model by experiencing the universe via the medium of human-generated text, which is much more limited than human sensory input but has already allowed for some very surprising behavior.
We’re not seeing diminishing returns yet, and in fact we’re going to see some interesting stuff happen as we start hooking up sensors and cameras as direct input, instead of these models building their world model indirectly through text alone. Let’s see what happens in 5 years or so before saying there are any diminishing returns.
Gary Marcus should be disregarded because he’s emotionally invested in The Bitter Lesson being wrong. He really wants LLMs to not be as good as they already are. He’ll find some interesting research along the lines of “here’s a limitation we found” and turn that into “LLMS BTFO IT’S SO OVER”.
The research is interesting for helping improve LLMs, but that’s the extent of it. I would not be worried about the limitations the paper found for a number of reasons:
- The paper tested o1-mini and llama3-8B, which are much smaller models with much more limited capabilities (GPT-4o got the problem correct when I tested it, without any special prompting techniques or anything).
- Until we hit a wall and really can’t find a way around it for several years, this sort of research falls into the “huh, interesting” territory for anybody that isn’t a researcher.
Gary Marcus is an AI crank and should be disregarded.
They laughed at Columbus, they laughed at Fulton, they laughed at the Wright brothers. But they also laughed at Bozo the Clown.
⅐ = 0.1̅4̅2̅8̅5̅7̅
The above works because 142857 * 7 = 999999, i.e. 1/7 = 142857/999999 = 0.142857142857…, but you also get interesting numbers for other trailing subsets of the digits:
7 * 7 = 49
57 * 7 = 399
857 * 7 = 5999
2857 * 7 = 19999
42857 * 7 = 299999
142857 * 7 = 999999
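If you want to check the pattern yourself, here’s a minimal Python sketch that peels successively longer trailing chunks off the repeating block 142857 and multiplies each by 7:

```python
# Take trailing chunks of the repeating block of 1/7 and multiply by 7;
# every product is a single digit followed by a run of 9s.
block = "142857"
for i in range(len(block) - 1, -1, -1):
    chunk = int(block[i:])
    print(f"{chunk} * 7 = {chunk * 7}")
```

The output matches the list above: 49, 399, 5999, 19999, 299999, 999999.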
Related to cyclic numbers:
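Here’s a minimal Python sketch of the cyclic property: multiplying 142857 by 1 through 6 just rotates its digits.

```python
# 142857 is a cyclic number: its multiples by 1..6 are
# digit rotations of the original number.
n = 142857
s = str(n)
rotations = {s[i:] + s[:i] for i in range(len(s))}  # all 6 rotations
for k in range(1, 7):
    m = str(n * k)
    print(f"{n} * {k} = {m}  rotation? {m in rotations}")
```

Every multiple comes out as a rotation (285714, 428571, 571428, 714285, 857142), which is exactly what makes it a cyclic number.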