The Tech Slice: Understanding AI-Nativity

September 23, 2024 by Karim Rabie

You might have already heard about the term “AI-Native.”

A nice adjective combined with a specific technology creates a buzzword that reflects the innovation and evolution of AI Technology in the corresponding domain.

AI-native Network, AI-Native Cloud, AI-Native application, etc.

See! It works perfectly.

You will start to see events with AI-Native titles and press releases announcing that “A specific enterprise successfully launched the first AI-native X,” and you will be surprised how the term evolves into a potential new xG architecture.

The community’s understanding of the term is crucial, and this article aims to facilitate that by presenting the concept in simple terms and highlighting the different views.

So, what does it mean to be native to something?

Let’s take the language context as an example.

I speak native Arabic. I was born in a house where all the inhabitants speak Arabic, so it was organic for me to learn my first Arabic words and expand my vocabulary over time. The fact that the surrounding community also speaks Arabic natively helped me to train and practice organically using Arabic as a primary means of communication.

What about the technology context?

If an app is native to a platform, it is designed to benefit from and inherit all platform characteristics and values, as clearly demonstrated in the cloud-native application function definitions.

If the cloud gives you elasticity, then developing the app to take advantage of that elasticity is part of being cloud-native, which comes with a whole set of advantages.
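To make “supporting elasticity” a bit more concrete, here is a minimal, illustrative Python sketch: the request handler keeps no in-process state, so any replica can serve any request and the platform is free to scale instances up or down. The Redis-backed store, the handle_request function, and the shopping-cart example are assumptions made up for this sketch, not anything from a specific platform.

```python
# Minimal sketch of "designing for elasticity" (illustrative only):
# the handler keeps no in-process state, so any replica can serve any
# request and the platform is free to scale instances up or down.
import os

import redis  # assumes the redis-py client and a reachable Redis instance

# Shared, external state instead of per-process memory.
store = redis.Redis(
    host=os.environ.get("REDIS_HOST", "localhost"),
    port=6379,
    decode_responses=True,
)

def handle_request(session_id: str, item: str) -> list:
    """Stateless handler: session data lives outside the process."""
    store.rpush(f"cart:{session_id}", item)           # append to shared state
    return store.lrange(f"cart:{session_id}", 0, -1)  # read the full cart back

if __name__ == "__main__":
    print(handle_request("user-42", "5g-data-bundle"))
```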

Now, let’s apply the same analogy to Telco networks and run a brief As-Is discovery.

  • Was AI born with these networks as part of a reference architecture?
  • Is the enterprise architecture set up to guarantee data exposure, collection, and preparation for AI-specific engines to process?
  • Do all network elements speak, understand, and use AI to perform their corresponding functions?
  • Does ML model development happen natively as part of the app delivery process?
  • Is it straightforward, business as usual, for an AI engine to get data from the different Telco domains and sub-domains and learn from it?

The answer is No.

There is no AI-nativity in the current mobile Telco network architecture and functions.

AI is commonly added as an enhancement to existing stacks, or as a specific product that provides a particular value using machine learning.

 

[Figure] As-Is State of AI/ML footprint in Telcos [Informative]

It is still the case that AI supports the network; we have yet to reach the point where the network natively supports AI and unlocks the power of innovative use cases in different domains.

There are different views and directions, with no single one dominating at this stage, but I’d pick three common points for a To-Be state of AI-Native networks.

Three common aspects for Telco Networks to transition into AI-Native Networks.

Next-generation applications and Network Functions are AI-native by design and architecture.

Intelligent apps should be developed and served under joint LCM (lifecycle management) governance with their ML models. As an informative example, imagine an intelligent UPF, intelligent AMF, intelligent CU, etc., each with native AI capabilities, rather than relying on a specific node that provides ML capabilities to the network in order to take a specific action.
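To illustrate what joint LCM governance could look like, here is a small, hypothetical Python sketch: a network function descriptor that carries its ML models inside the same package, so software and models are deployed, upgraded, and rolled back as one unit. The class names, fields, and the registry-style artifact URI are assumptions for illustration, not part of any standard.

```python
# Illustrative sketch (not from any standard): a network function that bundles
# its ML models, so NF software and models share one lifecycle.
from dataclasses import dataclass, field
from typing import List

@dataclass
class EmbeddedModel:
    name: str          # e.g. a traffic-prediction model (hypothetical)
    version: str
    artifact_uri: str  # where the trained model artifact lives (assumed registry)

@dataclass
class IntelligentNetworkFunction:
    nf_type: str                 # "UPF", "AMF", "CU", ...
    sw_version: str
    models: List[EmbeddedModel] = field(default_factory=list)

    def upgrade(self, new_sw: str, new_models: List[EmbeddedModel]) -> None:
        """Joint LCM: software and models change as one unit, rather than a
        separate 'ML node' being patched on its own schedule."""
        self.sw_version = new_sw
        self.models = new_models

# Example: an intelligent UPF shipping with its own QoS prediction model.
upf = IntelligentNetworkFunction(
    nf_type="UPF",
    sw_version="1.4.0",
    models=[EmbeddedModel("qos-traffic-predictor", "2.1", "s3://models/qos/2.1")],
)
upf.upgrade("1.5.0", [EmbeddedModel("qos-traffic-predictor", "2.2", "s3://models/qos/2.2")])
```

The point is simply that the model travels with the network function through every lifecycle step, rather than being managed by a separate system on a separate cadence.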

 

[Figure] A view on the AI-nativity of the applications [Informative]

A clear Overlay AI/ML Reference Architecture that Interlocks with the existing Telco domains & architecture (Underlay)

The architecture ensures that data collection, aggregation, preparation, and AI model training and serving all happen seamlessly, with zero interruption to services. Think of an enterprise-level data mesh that connects to all data sources and sinks across all Telco network domains, with network functions and management systems that expose the right metrics.

FG-ML5G-ARC5G – ITU: Unified architecture for machine learning in 5G and future networks [Example of the standardization efforts]
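As a rough illustration of the collection-and-preparation flow described above, here is a minimal Python sketch of an overlay collector that pulls metrics from per-domain adapters and normalizes them into one schema for training and serving. The domain names, fetch functions, and record fields are assumptions invented for this example; they are not taken from the ITU architecture.

```python
# Illustrative overlay collector: domain adapters feed one shared preparation
# step that tags records with a uniform schema for training and serving.
from typing import Callable, Dict, List

def fetch_ran_metrics() -> List[dict]:
    # Stand-in for a real RAN exposure interface (assumed).
    return [{"cell_id": "c1", "prb_utilization": 0.72}]

def fetch_core_metrics() -> List[dict]:
    # Stand-in for a real Core exposure interface (assumed).
    return [{"nf": "UPF-1", "throughput_gbps": 3.4}]

COLLECTORS: Dict[str, Callable[[], List[dict]]] = {
    "RAN": fetch_ran_metrics,
    "Core": fetch_core_metrics,
}

def collect_and_prepare() -> List[dict]:
    """Aggregate raw metrics from every domain and normalize them for the
    training/serving pipelines downstream."""
    prepared = []
    for domain, fetch in COLLECTORS.items():
        for record in fetch():
            record["source_domain"] = domain  # uniform schema across domains
            prepared.append(record)
    return prepared

if __name__ == "__main__":
    for row in collect_and_prepare():
        print(row)
```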

 

A solid MLOps Framework

The framework allows Telcos to govern and control the ML models’ LCM, enabling them to build intelligent app factories on top of the points above.
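To give a feel for that governance, here is a bare-bones, illustrative Python sketch of a model registry with explicit stage promotion and automatic archiving of the previous production version. The stage names and the registry API are assumptions for this sketch, not any specific MLOps product.

```python
# Bare-bones sketch of ML model LCM governance: every version is registered,
# promoted through explicit stages, and superseded versions are archived.
from dataclasses import dataclass
from typing import List

STAGES = ("registered", "staging", "production", "archived")

@dataclass
class ModelVersion:
    name: str
    version: str
    stage: str = "registered"

class ModelRegistry:
    def __init__(self) -> None:
        self._versions: List[ModelVersion] = []

    def register(self, name: str, version: str) -> ModelVersion:
        mv = ModelVersion(name, version)
        self._versions.append(mv)
        return mv

    def promote(self, mv: ModelVersion, stage: str) -> None:
        """Controlled promotion: at most one production version per model name."""
        if stage not in STAGES:
            raise ValueError(f"unknown stage: {stage}")
        if stage == "production":
            for other in self._versions:
                if other.name == mv.name and other.stage == "production":
                    other.stage = "archived"  # keep a clear rollback path
        mv.stage = stage

# Example: governing the LCM of a cell anomaly-detection model.
registry = ModelRegistry()
v1 = registry.register("cell-anomaly-detector", "1.0")
registry.promote(v1, "production")
v2 = registry.register("cell-anomaly-detector", "1.1")
registry.promote(v2, "production")  # v1 is archived automatically
```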

So, what can you expect from a standardization perspective?

I don’t expect 6G, or its actual deployment, to reach the To-Be state described above, because no one would welcome a dramatic architectural change, especially given the obstacles that 5G SA is facing.

Greenfield “modern” Telcos (if I may use the term) and new market players, such as the technology firms that have started to provide private 5G networks and have recently acquired mobile licenses, have a golden chance to pilot their vision and establish a community norm that might influence the standards accordingly, or even outpace standardization and become part of new, parallel community consortiums that take the technology forward.

My last words here, and they might deviate a little from the article’s topic: we all have the chance to contribute to and influence these directions. Let’s ensure that our advancements in Artificial Intelligence benefit humanity rather than threaten it.

We are all accountable.
