Given that we’re about due a new graphics card generation in Nvidia’s preferred biennial cycle, that the teaser came from the GeForce account rather than one of Nvidia’s non-gaming brands, and that Nvidia has since dropped the cryptic stuff and just straight-up confirmed a “GeForce Beyond” show on the 20th, you can bet your sweet memory bus that September 20th will see some kind of RTX GPU announcement. The RTX 30 series, which has contributed many of the past two years’ best graphics cards, will finally begin its journey into semi-retirement. What exactly will be on show remains to be seen, though my money’s on the announcement focusing specifically on the top-of-the-line RTX 4090; as wobbly as leaks can be, word on the street is remarkably harmonious that this will release ahead of any RTX 4080, which would in turn arrive before an RTX 4070. Will reckons the same.

What we can be sure of is that this announcement concerns Nvidia’s next-gen graphics hardware, in whatever form it takes. One of the tweets includes a video that, after showing a heavily GeForce-branded desktop setup, zooms in on an image of the Diagram for the Computation of Bernoulli Numbers. This was the first algorithm ever written to be carried out by a machine, the work of pioneering mathematician Ada Lovelace. And the poorly-kept secret codename for Nvidia’s next gaming GPU architecture, following Ampere and Turing before it? Lovelace. So it’s more or less certain to be a GPU-based event and not something to do with, say, GeForce Now.

There’s also a conspicuous post-it note in the video that lists some numbers: 208, 629, and 7538. I’d only be speculating as to what these mean, though – maybe specs for the RTX 4090? Core counts or the like? Or maybe, if you add up all three, you get the amount, in pounds, that the RTX 4090 will cost from the average scalper.

GeForce Beyond, by the by, will serve as Nvidia CEO Jensen Huang’s keynote address for the company’s GTC event. GTC is otherwise AI-focused, but then RTX graphics cards have always used AI/machine learning for features like DLSS, so that’s another point in the “probably about gaming GPUs” column.