LG's EXAONE 3.5: A New Standard in Generative AI

By Greg Tavarez

LG AI Research introduced EXAONE 3.0 in August 2024, and it was regarded as a major advancement in large language models. This third iteration focused on improved performance and efficiency, boasting a smaller size and faster inference than its predecessor, EXAONE 2.0.

EXAONE 3.0 was trained on a massive dataset of more than 60 million specialized documents, covering domains such as patents, code, mathematics and chemistry. Notably, LG AI Research made a bold move by open-sourcing EXAONE 3.0, a first for a domestic company in Korea, and said it planned to further expand the training dataset by the end of the year.

Well, the holidays are around the corner, which means the end of the year is here. As an early surprise, LG AI Research has open-sourced its latest AI model, EXAONE 3.5, which further improves on its predecessor's performance.

This move makes three distinct models accessible to the public: an ultra-lightweight model for on-device applications, a lightweight model for general-purpose use and a high-performance model for specialized tasks.

To enhance the accuracy and reliability of EXAONE 3.5, LG AI Research has incorporated advanced techniques such as Retrieval-Augmented Generation (RAG) and Multi-step Reasoning. RAG enables the model to generate answers based on real-time web search results or uploaded documents, while Multi-step Reasoning allows it to break down complex queries into smaller, more manageable steps to arrive at logical conclusions.
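To make the RAG idea concrete, here is a minimal sketch of the pattern in Python: retrieve the passages most relevant to a question, then pass them to the model as grounding context. The toy word-overlap retriever and the call_exaone function are illustrative assumptions, not LG AI Research's actual pipeline or API.

```python
# Minimal sketch of the Retrieval-Augmented Generation (RAG) pattern described
# above: retrieve the most relevant passages for a query, then hand them to the
# language model as grounding context. The retriever is a toy word-overlap
# scorer and call_exaone() is a hypothetical stand-in for whatever inference
# endpoint actually serves EXAONE 3.5.

def score(query: str, passage: str) -> float:
    """Crude lexical relevance: fraction of query words that appear in the passage."""
    q_words = set(query.lower().split())
    p_words = set(passage.lower().split())
    return len(q_words & p_words) / max(len(q_words), 1)

def retrieve(query: str, corpus: list[str], k: int = 3) -> list[str]:
    """Return the k passages that best match the query."""
    ranked = sorted(corpus, key=lambda p: score(query, p), reverse=True)
    return ranked[:k]

def build_prompt(query: str, passages: list[str]) -> str:
    """Assemble a grounded prompt: retrieved context first, then the question."""
    context = "\n".join(f"- {p}" for p in passages)
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n\n"
        f"Question: {query}\nAnswer:"
    )

def call_exaone(prompt: str) -> str:
    """Hypothetical model call; replace with a real inference endpoint."""
    return f"[model response to a {len(prompt)}-character grounded prompt]"

if __name__ == "__main__":
    corpus = [
        "EXAONE 3.5 ships in ultra-lightweight, lightweight and high-performance sizes.",
        "RAG grounds answers in retrieved documents or web search results.",
        "Multi-step reasoning breaks a complex query into smaller steps.",
    ]
    question = "How does RAG improve answer reliability?"
    prompt = build_prompt(question, retrieve(question, corpus, k=2))
    print(call_exaone(prompt))
```

The same structure applies whether the context comes from uploaded documents or real-time web search results; only the retrieval step changes.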

One of the standout features of EXAONE 3.5 is its ability to process lengthy texts, equivalent to approximately 100 pages at a time. This capability opens up new possibilities for various applications, from document summarization to in-depth analysis.
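As a rough illustration of how such long-context processing might be used, the sketch below splits a very long document into large windows, summarizes each, and then summarizes the summaries. The window size and the summarize function are assumptions made for illustration; the article does not specify EXAONE 3.5's exact context limit or API.

```python
# One way to work with documents on the order of 100 pages: map-reduce style
# summarization. chunk() splits the text into large windows, summarize() is a
# hypothetical single model call, and the final pass condenses the partial
# summaries into one answer.

def chunk(text: str, max_chars: int = 200_000) -> list[str]:
    """Split a long document into windows small enough for one model call."""
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

def summarize(text: str) -> str:
    """Hypothetical single-call summary; swap in a real EXAONE 3.5 endpoint."""
    return text[:120] + "..."  # placeholder output

def summarize_long_document(text: str) -> str:
    """Summarize each window, then summarize the combined partial summaries."""
    partials = [summarize(window) for window in chunk(text)]
    return summarize("\n".join(partials))
```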

“As AI technology becomes a key strategic asset for each country, developing AI models with our own technology is meaningful in contributing to enhancing national AI competitiveness,” said an official from LG AI Research.

LG AI Research publicly released a technical report detailing the performance evaluation of EXAONE 3.5 across various benchmarks. The results demonstrate the model's exceptional performance in real-world usability, long text processing, coding and mathematical tasks. By sharing these insights, LG AI Research aims to foster transparency and encourage further research in the field.

Additionally, LG employees can sign up and start using ChatEXAONE in their work immediately. ChatEXAONE applies encryption and privacy protection technology so that employees can use it within the company's security environment without worrying about leaking internal data.

LG AI Research expects ChatEXAONE to help employees increase their work productivity and efficiency, from real-time web information search to document summarization, translation, report writing, data analysis and coding.

With the application of EXAONE 3.5 to ChatEXAONE, LG AI Research enhanced its performance and added “Deep” and “Dive” functions.

"Deep” allows ChatEXAONE to analyze and infer multiple questions in stages and provide a comprehensive answer when a complex question is asked, and can be used when accurate and in-depth report-level results are wanted.

"Dive” is a feature that allows users to select a search scope such as general, global, academic and YouTube to get answers based on the exact source according to their purpose.

ChatEXAONE recommends 133 job-specific prompts based on 14 different job functions and provides personalized answers, and employees can set their areas of interest according to how they use it.

And LG AI Research does not plan to stop with EXAONE 3.5. It has been researching a Large Action Model that plans and acts on its own, and it intends to develop an AI agent based on that technology in 2025. So, we will see how this develops.

“The recent advancement of generative AI models has accelerated, and it is important to upgrade them quickly,” said Bae Kyunghoon, the president of LG AI Research. “We will speed up the pace of innovation and develop them into frontier models that represent Korea, with the goal of artificial super intelligence that can be applied to real-world industries.”

Be part of the discussion about the latest trends and developments in the Generative AI space at Generative AI Expo, taking place February 11-13, 2025, in Fort Lauderdale, Florida. Generative AI Expo covers the evolution of GenAI and will feature conversations on the technology's potential across industries and how it is already being used to help businesses improve operations, enhance customer experiences, and create new growth opportunities.




Edited by Alex Passett