Despite the beginnings of "AI fatigue" setting in as the technology continues to dominate the news cycle, and Google being ridiculed for the answers its AI Overviews search feature supposedly produces, there's no shortage of money being poured into artificial intelligence ventures.
The latest announcement comes via Elon Musk, who loves the letter X because it stands for anything and everything. Musk was early out of the gates with AI, being one of the key people behind OpenAI; his x.ai company announced that it has raised US$6 billion in a second round of funding from private equity funds and Saudi Arabian investment vehicle Kingdom Holding, which, as it happens, put money into Twitter (now X) as well.
Much of the money raised looks set to be spent on Nvidia graphics cards, which provide the accelerated computation needed to train and run AI models. Musk announced he's looking at building a large computer system by the end of 2025, and it won't be cheap. The retail price for an Nvidia H100 card with 80 gigabytes of memory hovers around the US$30,000 mark. They're in hot demand, so don't expect much of a wholesale discount on them either.
The current version of the Grok AI chatbot on Twitter-X uses 20,000 H100s, with the upcoming Grok 3 said to require 100,000 of the cards. With a planet-murdering power draw of up to 700 watts per card, if you missed the share market rush on Nvidia, perhaps it's a good idea to work out where in the world AI data centres are being built and invest in the electricity generators supplying them?
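To put those figures in rough perspective, here is a minimal back-of-envelope sketch using the retail price and power numbers quoted above; it ignores bulk discounts, networking, cooling and everything else that makes a real data centre expensive.

```python
# Rough scale estimate using the figures quoted above:
# ~US$30,000 retail per H100 and up to 700 W draw per card.
H100_PRICE_USD = 30_000   # approximate retail price per card
H100_POWER_W = 700        # maximum power draw per card

for cards in (20_000, 100_000):  # current Grok vs. the reported Grok 3 target
    gpu_cost_billions = cards * H100_PRICE_USD / 1e9
    peak_draw_mw = cards * H100_POWER_W / 1e6  # watts to megawatts
    print(f"{cards:>7,} cards: ~US${gpu_cost_billions:.1f} billion in GPUs, "
          f"~{peak_draw_mw:.0f} MW peak draw")
```

On those numbers the 100,000-card build works out to roughly US$3 billion in GPUs alone and around 70 MW of peak draw, which is why the electricity-generation angle above isn't entirely a joke.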
France, meanwhile, is going big on "IA" with President Macron announcing €400 million in funding for nine centres of excellence throughout the country, on top of ordering research organisations to mobilise their resources for the technology. There will be AI in schools as well, to teach kids about the technology. What the eccentric-sounding proposal to create "Cafés IA" to acculturate the citizens of France will end up looking like remains to be seen, but Macron's administration insists the tech is the future of the nation.
It might well be too, judging by recent investment activity. Hugging Face, the well-known AI company founded by three French entrepreneurs and popular with developers, is valued at somewhere over US$4 billion. Holistic AI, which was set up by French researchers who had worked at Google's DeepMind division, picked up a cool US$220 million in funding just months after kicking off. It's now called H and aims to develop an artificial general intelligence, or AGI, which computer scientists say will be cleverer than humans.
It's important to bear in mind that investors will seek a decent return on the vast amounts of money they're sinking into AI. This puts enormous pressure on the funded companies to deploy the technology in as many places as possible, so as to start bringing in the necessary profits. If that doesn't pan out, the venture capitalists who put up the money will not be pleasant to deal with. This could push some less-than-ethical enterprises to implement AI not as a tool to make employees more productive, but as a way to cut staff numbers and save costs.
Is it all bad for humans then, being outcompeted by data centres filled with servers loaded up with high-end video cards that excel at doing many maths tasks in parallel? Possibly not, or at least not until large language models for generative AI become an awful lot smarter. While just a few years ago AI evangelists would declare with total certainty that a white-collar worker wipeout was imminent, the perspective is not quite as swivel-eyed and wild now. You won't lose your job to AI, but to someone using AI, apparently.
Maybe, but that notion needs to be tempered by AI's voracious hunger for fresh data if it is to remain useful and topical. Here, human-generated data still seems to command a premium. The weird and wonderful web forum that's Reddit is a case in point, having sold its user data to Google for hundreds of millions of dollars, a deal that investors who bought into the company's share market float really seem to like, glue-on-pizza recipes notwithstanding.
AI doesn't have to be limited to giant data centres however, and some of the more useful implementations are found in everyday devices like smartphones. Known as Edge AI, neural network chips and other dedicated hardware are turning up everywhere, for applications such as image manipulation, voice recognition, automation and even some generative tasks. An AI-generated and human-edited article by bankers Morgan Stanley suggests Edge AI will in fact be an upcoming investment honeypot.
There's actually an opportunity coming up locally to learn more about what AI can do. In December, Auckland will host the five-day 2024 International Conference on Neural Information Processing. It looks like there will be many interesting showcases, workshops and sessions at ICONIP, which is supported by AUT and the University of Auckland among others, and features researchers from around the world. Put the conference in your calendar now if you're interested in AI and how it might interact with humanity.
6 Comments
But it's not intelligence, is it? It's just pattern matching, statistics and computing power under the direction of some code.
Still got a way to go.
https://www.livescience.com/health/neuroscience/new-3d-map-charted-with…
https://www.tomshardware.com/tech-industry/full-scan-of-1-cubic-millime…
The cubic millimeter of brain matter is only one-millionth of the size of an adult human brain, and yet the imaging scans and full map of its intricacies comprises 1.4 petabytes, or 1.4 million gigabytes. If someone were to utilize the Google/Harvard approach to mapping an entire human brain today, the scans would fill up 1.6 zettabytes of storage.
Taking these logistics further, storing 1.6 zettabytes on the cheapest consumer hard drives (assuming $0.03 per GB) would cost a cool $48 billion, and that's without any redundancy. The $48 billion price tag does not factor in the cost of server hardware to put the drives in, networking, cooling, power, and a roof to put over this prospective data center. The roof in question will also have to be massive; assuming full server racks holding 1.8 PB, the array of racks needed to store the full imaging of a human brain would cover over 140 acres if smushed together as tightly as possible.
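For anyone wanting to sanity-check those numbers, here is a minimal back-of-envelope sketch; the brain volume and rack footprint are my own assumptions, not figures from the linked articles, so the totals land close to but not exactly on the quoted ones.

```python
# Sanity-check of the quoted figures: 1.4 PB of imaging data per cubic
# millimetre, scaled to a whole brain and priced at US$0.03 per GB.
PB_PER_MM3 = 1.4
BRAIN_VOLUME_MM3 = 1.2e6   # assumed adult brain volume in cubic millimetres
USD_PER_GB = 0.03
PB_PER_RACK = 1.8
RACK_FOOTPRINT_M2 = 0.6    # assumed floor space per tightly packed rack
M2_PER_ACRE = 4046.86

total_pb = PB_PER_MM3 * BRAIN_VOLUME_MM3       # ~1.7 million PB
total_zb = total_pb / 1e6
drive_cost_usd = total_pb * 1e6 * USD_PER_GB   # PB -> GB, then price per GB
racks = total_pb / PB_PER_RACK
acres = racks * RACK_FOOTPRINT_M2 / M2_PER_ACRE

print(f"~{total_zb:.1f} ZB of scans, ~US${drive_cost_usd/1e9:.0f} billion "
      f"in drives, ~{racks:,.0f} racks covering ~{acres:.0f} acres")
```

Plug in a slightly smaller brain volume and you land on the 1.6 zettabytes and US$48 billion quoted above.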
Just put some glue on your pizza, mix gasoline into your spaghetti and eat some rocks; after all, AI asserts that it truthfully improves your diet:
https://futurism.com/the-byte/googles-ai-glue-on-pizza-flaw
https://theconversation.com/eat-a-rock-a-day-put-glue-on-your-pizza-how…
Also, let's take the advice that a backpack is as effective as a parachute when jumping from a plane with a grain of salt, eh.
Don't forget to count and cross-check the number of fingers on those pictures. Or check that your rat's genitals don't end up larger than the body itself.
Pro tip: AI code generation is less than 50% correct even when it has trained on the exact answers in its training data. Most other times it is less than 20% accurate. Flipping a coin on any random answer you pull off Stack Overflow would actually be an improvement.
Tip to remove the fraudulent false information of AI Overviews from your Google searches, with instructions:
https://arstechnica.com/gadgets/2024/05/google-searchs-udm14-trick-lets…
Seriously, I started doing this and it is a massive improvement in the search results; teamed up with the removal of fake Temu products, it is almost like a real search engine again. If I wanted satirical fake news I would actually read The Onion, and Reddit. I don't need Google's AI to regurgitate snippets of both and assert them as valid answers (especially when the real answers could be used instead).
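As I understand the trick from the linked article, it boils down to appending a udm=14 parameter to the search URL so Google serves the plain "Web" results without the AI Overview. A minimal sketch of what that looks like (the helper function name is just for illustration):

```python
# Build a Google search URL with the "udm=14" parameter, which requests
# the plain "Web" results tab without AI Overviews or other modules.
from urllib.parse import urlencode

def web_only_search_url(query: str) -> str:
    return "https://www.google.com/search?" + urlencode({"q": query, "udm": 14})

print(web_only_search_url("glue on pizza"))
# https://www.google.com/search?q=glue+on+pizza&udm=14
```

You can also register that URL pattern (with %s in place of the query) as a custom search engine in most browsers, so every address-bar search uses it by default.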