Beyond LLMs: How SandboxAQ's Large Quantitative Models Can Optimize Enterprise AI



While large language models (LLMs) and generative AI dominated enterprise AI conversations over the past year, they are not the only ways businesses can benefit from AI.

One alternative is large quantitative models (LQMs). These models are trained to optimize specific objectives and parameters relevant to an industry or application, such as material properties or financial risk metrics. This differs from the language understanding and generation tasks of LLMs. Among the leading proponents and commercial vendors of LQMs is SandboxAQ, which announced today that it has raised $300 million in a new funding round. The company was originally part of Alphabet and was spun out as a separate business in 2022.

The funding is a testament to the company's success to date and, more importantly, to its growth prospects as it targets enterprise AI use cases. SandboxAQ has partnered with major consulting firms, including Accenture, Deloitte and EY, to promote its enterprise solutions. A key advantage of LQMs is their ability to solve complex, domain-specific problems in fields where the underlying physics and quantitative relationships are critical.

"This is all about using AI to build the core products at these companies," SandboxAQ CEO Jack Hidary told VentureBeat. "So if you want to create a drug, a diagnostic, a new material, or manage risk at a big bank, that's where quantitative models shine."

Why LQMs matter for enterprise AI

LQMs have different goals and operate differently than LLMs. Unlike LLMs, which process textual data from the internet, LQMs generate their data from mathematical equations and physical principles. The goal is to solve the quantitative problems an enterprise may face.

"We create data and get data from quantitative sources," Hydari explained.

This approach enables progress in areas where traditional methods have stalled. In battery development, for example, where lithium-ion technology has dominated for 45 years, LQMs can simulate millions of possible chemical combinations without requiring physical prototypes.
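To make the contrast with text-trained models concrete, here is a minimal, hypothetical sketch of generating training data from a physical equation rather than from scraped text. The Arrhenius-style conductivity formula, parameter ranges, and screening step are illustrative stand-ins, not SandboxAQ's actual chemistry models.

```python
import numpy as np

# Hypothetical illustration: generate candidate data from a physical equation
# instead of scraping text. The formula and parameter ranges are placeholders.

rng = np.random.default_rng(seed=0)
n_candidates = 1_000_000  # screen a large batch of simulated chemistries

# Randomly sampled material parameters for each candidate electrolyte
activation_energy = rng.uniform(0.2, 0.6, n_candidates)   # eV
prefactor = rng.uniform(1e2, 1e4, n_candidates)           # S/cm (pre-exponential)
temperature = 298.0                                        # K
k_boltzmann = 8.617e-5                                     # eV/K

# Arrhenius relation: sigma = A * exp(-Ea / (kB * T))
conductivity = prefactor * np.exp(-activation_energy / (k_boltzmann * temperature))

# Keep the most promising candidates for more expensive, higher-fidelity simulation
top_idx = np.argsort(conductivity)[-10:]
print("Best simulated conductivities (S/cm):", conductivity[top_idx])
```

In practice, the cheap screening pass would only narrow the field; the surviving candidates would then be handed to much more expensive, higher-fidelity simulations.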

Similarly, in pharmaceutical development, where traditional methods suffer a high failure rate in clinical trials, LQMs can analyze molecular structures and interactions at the electron level. In financial services, meanwhile, LQMs address the limitations of traditional modeling methods.

"Monte Carlo simulation is not enough to handle the complexity of structured instruments," Hidari said.

Monte Carlo simulation is a classic type of computational algorithm that uses random sampling to produce results. With SandboxAQ's LQM approach, a financial services firm can scale beyond what Monte Carlo simulation can handle. Some financial portfolios can be extremely complex, with many structured instruments and options, Hidary noted.

"If I have a portfolio and I want to know what's going on in that portfolio," Hidari said. "What I want to do is I want to build a 300 million to 500 million version of that portfolio, tweak it a little bit, and then I want to look at the risk."

How SandboxAQ is using LQMs to improve cybersecurity

SandboxAQ's LQM technology aims to enable enterprises to create new products, materials and solutions, rather than merely optimizing existing processes.

Among the enterprise verticals where the company has innovated is cybersecurity. In 2023, the company first released its Sandwich cryptography management technology. It has since expanded that work with its AQtive Guard enterprise solution.

The software can analyze enterprise files, applications, and network traffic to determine the encryption algorithms in use. This includes detecting the use of outdated or broken encryption algorithms such as MD5 and SHA-1. SandboxAQ feeds this information into a governance model that can alert the Chief Information Security Officer (CISO) and compliance teams of potential vulnerabilities.
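AQtive Guard itself is proprietary, but a heavily simplified sketch of the underlying idea, flagging files that reference deprecated hash algorithms, might look like the following. The file patterns and algorithm list are assumptions for illustration; a production tool also inspects binaries, certificates and network traffic.

```python
import re
from pathlib import Path

# Simplified illustration of cryptographic inventory scanning: walk a source
# tree and flag references to deprecated or broken algorithms. This toy
# version only greps Python source files.

DEPRECATED_ALGORITHMS = {"md5", "sha1", "rc4", "des"}
PATTERN = re.compile(r"\b(" + "|".join(DEPRECATED_ALGORITHMS) + r")\b", re.IGNORECASE)

def scan_tree(root: str) -> list[tuple[str, int, str]]:
    """Return (file, line number, matched algorithm) for every hit under root."""
    findings = []
    for path in Path(root).rglob("*.py"):
        for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            match = PATTERN.search(line)
            if match:
                findings.append((str(path), lineno, match.group(1)))
    return findings

if __name__ == "__main__":
    for file, lineno, algo in scan_tree("."):
        print(f"[WARN] {file}:{lineno} references deprecated algorithm {algo}")
```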

While an LLM could be used for the same purpose, LQMs provide a different approach. LLMs are trained on vast, unstructured internet data that may include information about encryption algorithms and vulnerabilities. In contrast, SandboxAQ's LQMs are built using objective, quantitative data about encryption algorithms, their properties and known vulnerabilities. The LQMs use this structured data to build models and knowledge graphs for cryptographic analysis, rather than relying on general language understanding.
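As a rough illustration of that structured approach, a cryptographic knowledge graph might encode algorithms, their properties and known weaknesses as typed relationships. The handful of entries below is a tiny hand-written sample for illustration, not SandboxAQ's actual model.

```python
import networkx as nx

# Toy cryptographic knowledge graph: nodes are algorithms and threats,
# edges are typed relationships. The facts below are a small sample.

kg = nx.DiGraph()
kg.add_edge("MD5", "collision attack", relation="vulnerable_to")
kg.add_edge("SHA-1", "collision attack", relation="vulnerable_to")
kg.add_edge("RSA-2048", "Shor's algorithm", relation="vulnerable_to")
kg.add_edge("ML-KEM", "Shor's algorithm", relation="resistant_to")

def weaknesses(algorithm: str) -> list[str]:
    """List every threat the graph marks the algorithm as vulnerable to."""
    return [
        target
        for _, target, data in kg.out_edges(algorithm, data=True)
        if data["relation"] == "vulnerable_to"
    ]

print(weaknesses("MD5"))     # ['collision attack']
print(weaknesses("ML-KEM"))  # []
```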

Moving forward, SandboxAQ is also working on a patching module that will automatically recommend and implement updates to the encryption in use.

Quantum dimensions without quantum computers or transformers

The original idea behind SandboxAQ was to combine AI techniques with quantum computing.

Hidary and his team realized early on that true quantum computers would not be available or powerful enough in the short term. Instead, SandboxAQ uses quantum principles implemented through advanced GPU infrastructure. Through a partnership, SandboxAQ has extended Nvidia's CUDA capabilities to handle quantum techniques.
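As a toy picture of what running quantum techniques on classical hardware can mean, the snippet below simulates a two-qubit state vector with plain NumPy. It is only a didactic sketch of the linear-algebra core of the idea; SandboxAQ's CUDA-accelerated stack operates at an entirely different scale.

```python
import numpy as np

# Toy state-vector simulation of two qubits on classical hardware.

# Start in |00>
state = np.zeros(4, dtype=complex)
state[0] = 1.0

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Apply H to the first qubit, then CNOT, producing a Bell state
state = np.kron(H, I) @ state
state = CNOT @ state

print("Amplitudes:", np.round(state, 3))                  # [0.707, 0, 0, 0.707]
print("Probabilities:", np.round(np.abs(state) ** 2, 3))  # [0.5, 0, 0, 0.5]
```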

SandboxAQ also does not use transformers, which are the basis of almost all LLMs.

"The models we train are neural network models and knowledge graphs, but they're not transformers," Hidari said. "You can build from equations, but you can also have quantitative data from sensors or other sources and networks."

Although LQMs are different from LLMs, Hidary doesn't see it as an either/or situation for enterprises.

"Use the LLMs for what they're good at, then bring in the LQMs for what they're good at," he said.
