OpenAI Research Explores AI Models Calling External Models via Cloud

Summary

OpenAI researchers are exploring AI models calling specialized external models via the cloud to enhance capabilities.
– This concept extends the current function calling feature, allowing AI models to delegate sub-tasks to expert models.
– A “routing” model may be used to determine and direct queries to the best-suited specialized models.
– Benefits include accessing best-in-class expertise, improving efficiency, and increasing flexibility.
– Challenges include latency, cost, reliability, security, and standardization, with the research still in its early stages.

OpenAI researchers are investigating a future in which models such as GPT-4 could significantly extend their capabilities by calling on other specialized AI models hosted in the cloud to handle specific parts of a complex task. This concept, which extends ideas like OpenAI’s existing “function calling” feature, points towards a more modular and collaborative AI ecosystem.

From Function Calls to Model Calls

Currently, OpenAI models can use function calling to interact with external APIs and tools. The research takes this a step further, envisioning a system in which one AI model identifies a sub-task that requires specialized knowledge (e.g., advanced mathematics, specific programming languages, niche domain expertise) and delegates it by “calling” another AI model optimized for that exact purpose.
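
To make the idea concrete, here is a rough sketch of how delegation to an expert model could look if it were exposed through the existing function-calling interface of the current OpenAI Python SDK. The tool name (call_math_expert), its schema, and the delegation step are invented for illustration; they do not describe OpenAI's actual research system.

```python
# Illustrative only: expose a hypothetical specialized "expert" model as a tool
# through OpenAI's existing function-calling interface. The tool name, schema,
# and the idea of forwarding to a cloud-hosted expert are assumptions.
from openai import OpenAI

client = OpenAI()

tools = [
    {
        "type": "function",
        "function": {
            "name": "call_math_expert",  # hypothetical delegate
            "description": "Delegate a hard symbolic-math sub-task to a "
                           "specialized external model and return its answer.",
            "parameters": {
                "type": "object",
                "properties": {
                    "problem": {
                        "type": "string",
                        "description": "The sub-task to hand off.",
                    }
                },
                "required": ["problem"],
            },
        },
    }
]

# The primary model decides whether the sub-task should be delegated.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Integrate x^3 * e^x and explain the result."}],
    tools=tools,
)

tool_calls = response.choices[0].message.tool_calls
if tool_calls:
    # In the envisioned system, this is where a cloud-hosted expert model
    # (rather than a conventional API or tool) would receive the sub-task.
    print("Delegating:", tool_calls[0].function.arguments)
```

In other words, the mechanism already used to hand work to calculators, databases, or web APIs would instead hand work to another model.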

TechCrunch highlights internal OpenAI research discussing this possibility. The idea involves a potential “routing” model – an AI that determines which specialized model is best suited to a particular query or sub-task and directs the request accordingly. This would allow the primary model to orchestrate solutions by composing capabilities from various expert models.
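
A minimal sketch of that routing idea, assuming it were built on today's public API, might use one small model to classify the query and a second call to forward it to the chosen specialist. The specialist categories and the models mapped to them below are placeholders; dedicated expert models of this kind are not publicly available.

```python
# Minimal routing sketch: a lightweight model picks a specialist category,
# then the query is forwarded to whichever model backs that category.
# The SPECIALISTS registry is hypothetical and uses public models as stand-ins.
from openai import OpenAI

client = OpenAI()

SPECIALISTS = {
    "math": "gpt-4o",        # stand-ins; a real system would point at
    "code": "gpt-4o",        # dedicated expert models hosted in the cloud
    "general": "gpt-4o-mini",
}

def route(query: str) -> str:
    """Ask a small model which specialist fits best, then delegate to it."""
    decision = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": "Answer with exactly one word (math, code, or general): "
                       f"which specialist should handle this query?\n\n{query}",
        }],
    ).choices[0].message.content.strip().lower()

    specialist = SPECIALISTS.get(decision, SPECIALISTS["general"])
    answer = client.chat.completions.create(
        model=specialist,
        messages=[{"role": "user", "content": query}],
    )
    return answer.choices[0].message.content

print(route("Prove that the sum of two even numbers is even."))
```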

Rationale: Composition Over Monoliths

The driving force behind this research is the potential to create more powerful and versatile AI systems without building ever-larger monolithic models that attempt to master everything. By leveraging specialized external models, a system could:

  • Access Best-in-Class Expertise: Utilize models specifically trained for optimal performance on narrow tasks.

  • Improve Efficiency: Potentially reduce the computational load on the primary model.

  • Increase Flexibility: Easily integrate new capabilities by connecting to newly developed specialized models.

This modular approach aligns with the broader trend towards developing AI agents capable of complex, multi-step reasoning and action sequences.

Research Hurdles Remain

While promising, researchers acknowledge that significant challenges must be overcome to make this vision practical. These include:

  • Latency: Ensuring rapid communication between models is crucial for real-time applications (a client-side handling sketch follows this list).
  • Cost: Managing the expense of potentially multiple model calls for a single user query.
  • Reliability & Security: Guaranteeing that the external models are dependable and not malicious.
  • Standardization: Developing protocols for seamless inter-model communication.

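One way the latency and reliability concerns might be handled on the client side, sketched here with the current Python SDK and a placeholder expert-model name, is to put a strict timeout and retry budget on the delegated call and fall back to the primary model when the expert is slow or unavailable. This is an assumption about defensive integration, not a description of OpenAI's design.

```python
# Defensive-handling sketch: tight timeout and retry budget on the delegated
# call, with graceful fallback to the primary model. The expert model name is
# a placeholder; the pattern itself is the point, not the specific endpoint.
from openai import OpenAI, APITimeoutError, APIError

# Fail fast on the delegated call rather than stalling the end user.
expert_client = OpenAI(timeout=5.0, max_retries=1)
primary_client = OpenAI()

def ask_with_fallback(query: str) -> str:
    try:
        return expert_client.chat.completions.create(
            model="hypothetical-expert-model",  # placeholder for a cloud-hosted specialist
            messages=[{"role": "user", "content": query}],
        ).choices[0].message.content
    except (APITimeoutError, APIError):
        # Reliability hedge: degrade gracefully to the primary model.
        return primary_client.chat.completions.create(
            model="gpt-4o",
            messages=[{"role": "user", "content": query}],
        ).choices[0].message.content
```

Cost control would likely follow the same pattern: a budget per user query that caps how many delegated calls the orchestrating model is allowed to make.
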
This exploration remains firmly in the research phase, but it offers a glimpse into how OpenAI is thinking about the next generation of AI architecture – one built on collaboration and specialized expertise.

(Source: TechCrunch)
