The Next Big Breakthrough in AI Will Be Around Language
Most companies recognize that aggressive adoption of digital technologies is increasingly critical to staying competitive. Research shows that the top 10% of early adopters of digital technologies have grown at twice the rate of the bottom 25%, and that they are using cloud systems - not legacy systems - to enable adoption, a trend that is expected to accelerate among industry leaders over the coming five years. Many laggard and middle-of-the-pack companies, by comparison, are dramatically underestimating the cloud resources they will need to access, power, or train a new generation of intelligent applications presaged by breakthroughs like GPT-3, a state-of-the-art natural language processing (NLP) tool.
THE BIG BREAKTHROUGHS IN AI WILL BE ABOUT LANGUAGE
The 2010s produced breakthroughs in vision-enabled technologies, from accurate image searches on the web to computer vision systems for medical image analysis or for detecting defective parts in manufacturing and assembly. GPT-3, developed by OpenAI, indicates that the 2020s will be about major advances in language-based AI tasks. Previous language processing models used hand-coded rules (for syntax and parsing), statistical techniques, and, increasingly over the last decade, artificial neural networks to perform language processing. Artificial neural networks can learn from raw data, requiring far less routine data labeling and feature engineering. GPTs (generative pre-trained transformers) go much deeper, relying on a transformer architecture built around an attention mechanism that learns contextual relationships between the words in a text. Researchers who were given access to GPT-3 via a private beta were able to induce it to produce short stories, songs, press releases, technical manuals, text in the style of particular writers, guitar tabs, and even computer code.
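To make the attention idea concrete, here is a minimal, illustrative sketch of the scaled dot-product attention step at the heart of transformer models, written in plain Python with NumPy. The toy token vectors and dimensions are made up for illustration; a real model learns separate query, key, and value projections and stacks many such layers.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core attention operation used inside transformer models.

    Each row of Q (queries) is compared against every row of K (keys);
    the resulting weights decide how much of each row of V (values)
    contributes to the output, letting every token attend to every other token.
    """
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V  # weighted mix of the value vectors

# Toy example: 4 "tokens", each represented by an 8-dimensional vector.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
# In a real transformer, Q, K, and V come from learned linear projections of x;
# here we reuse x directly just to show the mechanics.
print(scaled_dot_product_attention(x, x, x).shape)  # (4, 8)
```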
GPT-3 is far from perfect. Its numerous flaws include sometimes producing nonsense or biased responses, incorrectly answering trivial questions, and generating plausible but false content. Even one of the leaders at OpenAI cautioned against over-hyping GPT-3. All of this suggests that much work remains to be done, but the writing, so to speak, is on the wall: a new stage of AI is upon us.
GPT-3 is only one of many advanced transformers now emerging. Microsoft, Google, Alibaba, and Facebook are all working on their own versions. These tools are trained in the cloud and are accessible only through a cloud application programming interface (API). Companies that want to harness the power of next-generation AI will need to shift their compute workloads from legacy systems to cloud-based AI services like GPT-3.
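As a rough illustration of what "accessible only through a cloud API" means in practice, the sketch below sends a prompt to a hosted language model over HTTPS using Python's requests library. The endpoint and field names follow OpenAI's publicly documented completions API, but the model name and parameter values here are assumptions chosen for illustration; other providers expose similar request and response shapes.

```python
import os
import requests

# A minimal sketch of calling a cloud-hosted language model over HTTPS.
# The model never leaves the provider's cloud; applications reach it
# through authenticated HTTP calls like this one.
API_KEY = os.environ["OPENAI_API_KEY"]  # credentials issued with cloud access

response = requests.post(
    "https://api.openai.com/v1/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "davinci-002",  # illustrative model name; check current offerings
        "prompt": "Draft a two-sentence press release announcing a new mobile app.",
        "max_tokens": 80,   # cap the length of the generated text
        "temperature": 0.7, # higher values produce more varied wording
    },
    timeout=30,
)
print(response.json()["choices"][0]["text"])
```

The point is less the specific provider than the pattern: the workload runs in the cloud, so the company's integration effort is an API call rather than on-premises model training.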
NEXT-GEN APPS WILL ENABLE INNOVATION ACROSS THE ENTERPRISE
These cloud-AI services will enable the development of a new class of enterprise apps that are more creative (or "generative" - the "G" in GPT) than anything we've seen before. They will make the process of synthesizing words, intentions, and information in language cheaper, which will make many business activities more efficient and stimulate the innovation and growth we see with early adopters.
Analysis of more than 50 business-relevant proofs of concept (demos) of GPT-3 indicates that tomorrow's leading-edge business apps will fall into at least three broad creative categories, all linked to language understanding: writing, coding, and discipline-specific reasoning.
GPT-3's ability to write meaningful text based on a few simple prompts, or even a single sentence, can be uncanny. For instance, one of GPT-3's private beta testers used it to produce a convincing blog post on the subject of bitcoin. Among the demos we analyzed, there were apps for developing new podcasts, generating email and ad campaigns, suggesting how to run board meetings, and intelligently answering questions that would befuddle earlier language systems.
Based on prompts from humans, GPT-3 can also code - writing instructions for computers or systems. It can even convert natural language to programming language. In a natural language (English, Spanish, German, etc.), you describe what you want the code to do - such as building an internal or customer-facing website - and GPT-3 then writes the program.
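A hedged sketch of what that natural-language-to-code workflow might look like: the generate_code helper below is hypothetical, not a documented feature, and it simply wraps the same kind of completions request shown earlier with a prompt asking the model to return HTML and CSS for a page described in plain English.

```python
import os
import requests

def generate_code(description: str) -> str:
    """Hypothetical helper: send a plain-English description of a web page
    to a cloud language model and return the code it writes back."""
    prompt = (
        "Write the HTML and CSS for the following page.\n"
        f"Description: {description}\n"
        "Code:\n"
    )
    response = requests.post(
        "https://api.openai.com/v1/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={"model": "davinci-002", "prompt": prompt,
              "max_tokens": 400, "temperature": 0},
        timeout=60,
    )
    return response.json()["choices"][0]["text"]

print(generate_code(
    "An internal landing page with the company logo, a search bar, "
    "and a three-column list of department links."
))
```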
The ability to think about content, procedures, and knowledge in a scientific or technical field suggests other potentially fertile applications of GPT-3. It can answer chemistry questions - in one demo, it correctly predicted five of six chemical combustion reactions. It can automatically plot graphs based on verbal descriptions, taking much of the drudgery out of tasks like creating presentations. One beta tester created a GPT-3 bot that enables people with no accounting skills to generate financial statements. Another application can answer a deliberately difficult medical question and discuss the underlying biological mechanisms. The app was given a description of a 10-year-old boy's set of respiratory symptoms and was informed that he was diagnosed with an obstructive disease and given medication. Then it was asked what protein receptor the medication was likely to act on. The program correctly identified the receptor and explained that the boy had asthma and that it is typically treated with bronchodilators that act on that receptor.
This general reasoning potential across writing, coding, and science suggests that the use of cloud-powered transformers could become a meta-discipline, applicable across management sciences, data sciences, and the physical and life sciences. Further, across non-technical jobs, the cloud in combination with GPT-3 will lower the barrier to scaling digital innovations. Non-technical staff will be able to use everyday natural language rather than programming languages to build apps and solutions for customers.
REIMAGINED JOBS WILL INCREASE PRODUCTIVITY
In light of these coming changes, companies will need to rethink not only IT resources but also human resources. They can begin by analyzing the bundles of tasks in current roles, uncovering the specific tasks that AI can augment, and unleashing technical and non-technical workers alike to innovate faster. Using the Occupational Information Network (O*NET), a U.S. government standard for classifying workers into occupational categories, we analyzed 73 job categories in 16 career clusters and found that all clusters would be affected by GPT-3. Digging into the job categories, we found that 51 can be augmented or complemented by GPT-3 in at least one task, and 30 can use GPT-3 to complement two or more tasks.
Some tasks can be automated, but our analysis shows the larger opportunity lies in augmenting and amplifying human productivity and ingenuity. For example, communications professionals will see much of their routine text-generation work automated, while more critical communications, like ad copy and social media messages, will be augmented by GPT-3's ability to help develop lines of thought. Company scientists might use GPT-3 to generate graphs that inform colleagues about the product development pipeline. Meanwhile, to augment basic research and experimentation, they could ask GPT-3 to distill the findings of a specific set of scientific papers. The possibilities across disciplines and industries are limited only by the imagination of your people.
Source: HBR