Quantinuum is an integrated hardware and software quantum computing company that builds its machines on trapped-ion technology. It recently released a significant update to Lambeq, its open-source Python library and toolkit named after the mathematician Joachim Lambek. Lambeq (spelled with a Q for quantum) is the first and only toolkit that converts sentences into quantum circuits, using sentence meaning and structure to determine quantum entanglement.
Cambridge Quantum initially developed the toolkit before merging with Honeywell Quantum Solutions to form a new company named Quantinuum. Within the merged company, Cambridge Quantum acts as the quantum software arm.
According to Ilyas Khan, CEO of Quantinuum, Cambridge Quantum is still marketed under its own brand because it has a large customer base and significant business and technical relationships across the industry.
Why Quantum Natural Language Processing is important
Natural Language Processing (NLP) is a form of artificial intelligence that allows computers to understand words and sentences. NLP is heavily used across all industry segments, with usage projected to grow by over 27% annually over the next five years.
While NLP is powerful, Quantum Natural Language Processing (QNLP) promises to be even more powerful than NLP by converting language into coded circuits that can run on quantum computers.
Applying QNLP to artificial intelligence could improve it significantly. Training AI models requires large amounts of data, and quantum computing promises to dramatically speed up the training process, possibly reducing months of training to mere hours or minutes.
Current NLP language models, built with transformer architectures and deep neural networks, consume considerable energy, creating environmental concerns. Within the decade, quantum computers are expected to scale from hundreds to millions of qubits, enabling expanded versions of QNLP that are more efficient, faster, able to handle enormous datasets, and less power-hungry, with a lower environmental impact.
The new Quantinuum Lambeq QNLP release has several key improvements:
- Lambeq’s training package supports popular supervised-learning libraries such as PyTorch, helping users efficiently train models for NLP tasks using the generated quantum circuits and tensor networks
- The toolkit can now create more quantum circuits
- Quantum circuits can now be defined from sentence structure more easily using context-free (syntax) diagrams
- The toolkit has improved the visualization of its output
- Expanded documentation has numerous examples for general users to follow
- A new command-line interface is now available, making most of the toolkit’s functionality available to users who have little or no programming knowledge
- A new supervised training module simplifies training parameterized quantum circuits and tensor networks for machine learning
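The supervised training of parameterized quantum circuits mentioned above follows the familiar gradient-descent pattern: gate angles are adjusted to minimize a loss on labeled data. Here is a rough, self-contained sketch using a toy single-parameter "circuit" (one RY rotation on one qubit); this is an illustration of the general technique, not lambeq's actual API:

```python
import math

def predict(theta):
    # Probability of measuring |1> after applying RY(theta) to |0>:
    # RY(theta)|0> = cos(theta/2)|0> + sin(theta/2)|1>
    return math.sin(theta / 2) ** 2

def loss(theta, target):
    # Squared error between the measured probability and the label.
    return (predict(theta) - target) ** 2

def train(target, theta=0.1, lr=1.0, steps=200, eps=1e-6):
    # Finite-difference gradient descent on the single gate angle.
    for _ in range(steps):
        grad = (loss(theta + eps, target) - loss(theta - eps, target)) / (2 * eps)
        theta -= lr * grad
    return theta

theta = train(target=1.0)  # learn an angle whose measurement outputs label 1
```

A real sentence circuit has many such parameters, and lambeq's training package handles the batching and optimizer plumbing via backends such as PyTorch; the loop above only shows the shape of the problem.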
In addition to the above improvements, Lambeq has a new neural-based parser named Bobcat. Parsers determine the meaning of a sentence by breaking it down into its parts. Bobcat was trained on word datasets and information sources annotated by humans rather than by computers. As a benefit to the community, Bobcat will also be released as a separate, stand-alone open-source tool sometime in the future.
Since its introduction, researchers have used the toolkit to advance practical, real-world QNLP applications such as automated dialogue, text mining, language translation, text-to-speech, language generation, and bioinformatics.
How Quantum Natural Language Processing works
Converting a sentence into a quantum circuit is complicated. However, QNLP has the advantage of being ‘quantum native’: language has the same compositional mathematical structure as that found in quantum systems. Lambeq also has a modular design that lets users swap components in and out of the model, providing flexibility in architecture design. Here is a simplified version of how the QNLP process works:
- QNLP converts a sentence into a logical format (syntax tree) so a computer can understand it.
- The software organizes the syntax tree into parts of speech by using mathematical linguistics to differentiate between verbs, nouns, prepositions, and adjectives.
- Parts of the sentence are then labeled according to the relationships between words.
- The sentence is converted into a string diagram, much like the sentence diagrams you may have learned in elementary school.
- Finally, the diagram is encoded as a tensor network or a quantum circuit implemented with TKET, ready to be optimized for machine learning tasks such as text classification.
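The steps above can be sketched in miniature. The following is a toy illustration with a hypothetical mini-lexicon and an assumed qubit-per-type assignment, not lambeq's actual API; it only shows how grammatical types flow from tagging through a string diagram to the width of a circuit:

```python
# Toy sketch of the QNLP pipeline. The lexicon, pregroup types, and qubit
# counts below are illustrative assumptions, not lambeq's real behaviour.

LEXICON = {
    "Alice": "n",               # noun
    "Bob": "n",                 # noun
    "likes": "n.r @ s @ n.l",   # transitive verb in pregroup notation
}

QUBITS_PER_ATOM = {"n": 1, "s": 1}  # wires per atomic type (assumed ansatz choice)

def tag(sentence):
    """Steps 1-3: break the sentence into words and label each with its type."""
    return [(word, LEXICON[word]) for word in sentence.split()]

def string_diagram(tagged):
    """Step 4: lay the typed words out as a flat, diagram-like string."""
    return "  ".join(f"{word}:{typ}" for word, typ in tagged)

def circuit_width(tagged):
    """Step 5: count the qubits an ansatz would allocate per atomic type."""
    width = 0
    for _, typ in tagged:
        for atom in typ.split("@"):
            base = atom.strip().split(".")[0]  # drop .r/.l adjoint markers
            width += QUBITS_PER_ATOM[base]
    return width

tagged = tag("Alice likes Bob")
print(string_diagram(tagged))  # Alice:n  likes:n.r @ s @ n.l  Bob:n
print(circuit_width(tagged))   # 5 qubits before any simplification
```

In the real toolkit, the circuit is then compiled and optimized with TKET, and diagram simplifications can remove many of these wires before anything runs on hardware.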
Notably, two of the three true QNLP pioneers now work together as senior researchers at Quantinuum: Professor Stephen Clark is Quantinuum’s Head of Artificial Intelligence, and Professor Bob Coecke is its Chief Scientist.
The foundation of QNLP is the DisCoCat (categorical compositional distributional) framework, developed by Bob Coecke, Stephen Clark, and Mehrnoosh Sadrzadeh in 2010. Once published, DisCoCat quickly became the gold standard because, unlike all other methods, it combined both meaning and grammar in a single language model.
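The core DisCoCat idea can be shown with toy numbers (the values below are illustrative, not from the paper): word meanings are tensors whose shapes come from their grammatical types, and the grammar dictates how those tensors contract into a single sentence meaning:

```python
# Toy DisCoCat-style composition. Nouns live in a 2-d "meaning" space; a
# transitive verb is a bilinear map, so the grammar n . (n.r s n.l) . n
# contracts subject and object vectors against the verb tensor.

ALICE = [1.0, 0.0]       # toy noun vector
BOB = [0.0, 1.0]         # toy noun vector
LIKES = [[0.9, 0.2],     # toy verb tensor: rows index the subject space,
         [0.1, 0.8]]     # columns index the object space

def sentence_meaning(subj, verb, obj):
    """Contract the word tensors as the pregroup grammar prescribes."""
    return sum(subj[i] * verb[i][j] * obj[j]
               for i in range(len(subj)) for j in range(len(obj)))

print(sentence_meaning(ALICE, LIKES, BOB))  # 0.2
print(sentence_meaning(BOB, LIKES, ALICE))  # 0.1 -- word order matters
```

Here the sentence space is one-dimensional for simplicity, so a sentence reduces to a single number; grammar (who is subject, who is object) changes the result even with identical words, which is exactly what purely bag-of-words meaning models cannot capture.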
Patrick Moorhead, Moor Insights & Strategy founder, CEO, and Chief Analyst, recently had an interesting two-part discussion about QNLP, quantum, and AI with Bob Coecke, Chief Scientist at Quantinuum. The videos can be accessed on the Moor Insights and Strategy YouTube channel. Part one can be seen here and part two can be watched here.
Many important NLP applications are beyond the capability of classical computers. As QNLP and quantum computers continue to improve and scale, many practical commercial quantum applications will emerge along the way. Considering the expertise and experience of Professor Clark and Professor Coecke, plus a collective body of their QNLP research, Quantinuum has a clear strategic advantage in current and future QNLP applications.
- In last year’s announcement, the toolkit was called “lambeq.” Research papers refer to the toolkit in the same manner. However, Quantinuum’s current announcement refers to the toolkit as λambeq. To simplify, I refer to the toolkit as “Lambeq” in this article.
- QNLP research is still in the experimental stage. It will be several years before it has advanced enough to deploy in a large production environment. QNLP is largely unexplored, and Lambeq has opened the door for researchers to use and advance the technology for broader, larger, and more unique applications.
- QNLP limitations result from the scaling limitations of today’s NISQ machines. But despite those limitations, early QNLP research is essential to help us understand and advance the science of both QNLP and quantum computing. Eventually, there will be significant real-world uses of QNLP.
- While Lambeq is the first toolkit for QNLP, other toolkits exist for generic quantum machine learning (QML) development.
- Lambeq is available as a conventional Python repository on GitHub here:
- More details about the new release can be found here.
- The documentation and tutorials can be found here.
Follow Paul Smith-Goodson on Twitter for current information on quantum and AI
Note: Moor Insights & Strategy writers and editors may have contributed to this article.
Moor Insights & Strategy, like all research and tech industry analyst firms, provides or has provided paid services to technology companies. These services include research, analysis, advising, consulting, benchmarking, acquisition matchmaking, or speaking sponsorships. The company has had or currently has paid business relationships with 8×8, A10 Networks, Advanced Micro Devices, Amazon, Ambient Scientific, Anuta Networks, Applied Micro, Apstra, Arm, Aruba Networks (now HPE), AT&T, AWS, A-10 Strategies, Bitfusion, Blaize, Box, Broadcom, Calix, Cisco Systems, Clear Software, Cloudera, Clumio, Cognitive Systems, CompuCom, CyberArk, Dell, Dell EMC, Dell Technologies, Diablo Technologies, Dialogue Group, Digital Optics, Dreamium Labs, Echelon, Ericsson, Extreme Networks, Flex, Foxconn, Frame (now VMware), Fujitsu, Gen Z Consortium, Glue Networks, GlobalFoundries, Revolve (now Google), Google Cloud, Graphcore, Groq, Hiregenics, HP Inc., Hewlett Packard Enterprise, Honeywell, Huawei Technologies, IBM, IonVR, Inseego, Infosys, Infiot, Intel, Interdigital, Jabil Circuit, Konica Minolta, Lattice Semiconductor, Lenovo, Linux Foundation, Luminar, MapBox, Marvell Technology, Mavenir, Marseille Inc, Mayfair Equity, Meraki (Cisco), Mesophere, Microsoft, Mojo Networks, National Instruments, NetApp, Nightwatch, NOKIA (Alcatel-Lucent), Nortek, Novumind, NVIDIA, Nutanix, Nuvia (now Qualcomm), ON Semiconductor, ONUG, OpenStack Foundation, Oracle, Panasas, Peraso, Pexip, Pixelworks, Plume Design, Poly (formerly Plantronics), Portworx, Pure Storage, Qualcomm, Rackspace, Rambus, Rayvolt E-Bikes, Red Hat, Residio, Samsung Electronics, SAP, SAS, Scale Computing, Schneider Electric, Silver Peak (now Aruba-HPE), SONY Optical Storage, Springpath (now Cisco), Spirent, Splunk, Sprint (now T-Mobile), Stratus Technologies, Symantec, Synaptics, Syniverse, Synopsys, Tanium, TE Connectivity, TensTorrent, Tobii Technology, T-Mobile, Twitter, Unity Technologies, UiPath, Verizon Communications, 
Vidyo, VMware, Wave Computing, Wellsmith, Xilinx, Zayo, Zebra, Zededa, Zoho, and Zscaler. Moor Insights & Strategy founder, CEO, and Chief Analyst Patrick Moorhead is a personal investor in technology companies dMY Technology Group Inc. VI and Dreamium Labs.