
Uncovering the secrets of effective demand forecasting in the B2B industry


Yann LeCun, Meta's chief scientist and a pioneer of deep learning, said he believes it will take decades before today's AI systems resemble sentient beings with common sense and the capacity to do more than creatively summarize massive amounts of text.

His view contrasts with that of Nvidia CEO Jensen Huang, who recently declared that within five years AI will be “fairly competitive” with humans, surpassing them at a number of mentally demanding tasks.

“I am familiar with Jensen,” LeCun said at a recent event marking the tenth anniversary of the Fundamental AI Research team at Meta, Facebook's parent company. The Nvidia CEO, LeCun noted, stands to gain a great deal from the AI frenzy: “He’s providing the weapons in this AI war.”

Of researchers trying to create artificial general intelligence (AGI), meaning AI comparable to human intelligence, LeCun remarked, “If you think AGI is in, the more GPUs you have to buy.” As long as researchers at companies such as OpenAI keep pursuing AGI, they will need more of Nvidia's computer chips.

LeCun predicted that “dog-level” or “cat-level” AI will likely arrive years before human-level AI. The technology sector's current emphasis on language models and text data, he said, will not be enough to build the sophisticated, human-like AI systems that scientists have been envisioning for decades.

“Text is a very poor source of information,” LeCun said, noting that the volume of text used to train current language models would take a human roughly 20,000 years to read.

LeCun added, “Train a system on the equivalent of 20,000 years of reading material, and they still don’t understand that if A is the same as B, then B is the same as A. There are a lot of really basic things about the world that they just don’t get through this kind of training.”

Unlike Microsoft, Google, and other tech giants, Meta is not currently placing a significant bet on quantum computing.
