“General-purpose AI is not enough for manufacturing and defense sites”… MakinaRocks CEO Yoon Sung-ho unveils industry-specific AI strategy

6 May 2026, 04:52

Speaking at a press conference on the 6th, MakinaRocks CEO Yoon Sung-ho said, “The physical AI era has already begun, but the first places it will become reality are not humanoids but manufacturing shop floors and the battlefield,” adding, “Unlike typical cloud environments, industrial sites demand high levels of precision, reliability, and security, so general-purpose AI alone cannot meet their needs.”

MakinaRocks cited closed-network environments and field-data issues as the core reasons AI adoption is difficult at industrial sites. Manufacturing plants and defense facilities often restrict the export of data, making it hard to apply conventional cloud-based AI services as-is, and because data structures and operating methods differ across industrial equipment, models cannot reach the accuracy companies expect without sufficient training on real field data. The company also stressed that in environments such as automotive production lines, where robots from multiple manufacturers operate together, unified management is difficult with any single vendor’s solution.

To address these environments, MakinaRocks said it has developed its own AI operating system, ‘Runway.’ Runway is designed to run even in closed networks, the company explained, operating on in-plant servers and industrial equipment, and it supports the full lifecycle of data collection, storage, training, deployment, and operation.

“An AI OS is the foundational software for running AI, like Windows in the PC era or enterprise ERP,” Yoon said. “On top of Runway, a company can operate hundreds or thousands of AI applications.”

What MakinaRocks emphasized most were references proven at industrial sites. Stressing that “only AI that actually works in the field is meaningful,” Yoon said the company has focused from the start on developing AI usable in real industrial environments such as factories and defense.

According to MakinaRocks, it has deployed more than 6,000 AI models at industrial sites to date, accumulating over 25 terabytes (TB) of operational data in the process. The company said this data is widening its lead over latecomers.

According to Yoon, references are among the factors enterprise decision-makers weigh most heavily when adopting AI solutions, and he expressed confidence that they will underpin future growth. “Manufacturing and defense companies tend not to switch away from a solution once it has been validated,” he said. “Latecomers face the structural handicap of having to deliver high AI performance without access to real data.”

Global big tech companies have also been expanding into the manufacturing and industrial AI market, but MakinaRocks plans to compete on its field-centered technical capabilities. “Global players are currently focused on cloud-based decision support and on ERP and finance,” Yoon said. “MakinaRocks differs in that it has concentrated on AI used in actual operating environments such as factories and industrial equipment.”

He also pointed to security and governance, which are growing in importance as AI agents spread, as a competitive factor. “The more autonomous an agent becomes, the greater the risk to the enterprise,” Yoon explained. “Runway is designed to operate AI reliably within a strong security and governance framework.”

MakinaRocks plans to put the proceeds of its IPO into advancing its AI OS and expanding its global business. The company set the goal of establishing itself as the standard in the global physical AI operating system market by developing a manufacturing-focused ‘Dark Factory OS’ and a defense-focused ‘Defense OS.’

Japan and Europe are its priority global markets. The company established a Japanese subsidiary last year and says it has already signed customers including a Japanese automaker and industrial equipment companies. In Europe, it is expanding through a partnership with Device Insight, a subsidiary of robotics company KUKA.

Finally, MakinaRocks said it aims to turn profitable in 2027 and, by 2030, to reach 100 billion won in revenue with overseas sales accounting for 20-30% of the total.
jihyun.lee@foundryco.com


Smart factories are here — but is your team ready to use them?

23 April 2026, 09:00

Since the emergence of Industry 4.0 in 2011, manufacturing has undergone a digital transformation. Industrial Internet of Things (IIoT) sensors now allow machines and assets to communicate seamlessly, while artificial intelligence has become a core business enabler. Cloud computing provides virtually limitless processing power and storage, and big data analytics has become essential for strategic decision-making. By integrating data from ERP systems with real-time machine data — via SCADA, PLCs, and other automated tools — manufacturing execution systems (MES) have paved the way for the modern smart factory.

Smart factories are not limited to MES alone; they also cover areas such as energy management systems (EMS), video analytics-based plant safety, digital quality inspection using vision-based cameras, immersive technology-based shopfloor training, and operational technology (OT) networks, firewalls and other related tools.

Moving up the value chain, factories today are designed using digital twins with full process simulations, and products are designed on product lifecycle management (PLM) platforms. Smart factory maturity is an evolution, tightly linked to the enterprise’s digital transformation plan. Still, 49% of enterprises lack confidence in their future manufacturing strategy.

While visiting various plants, the disparity in digital maturity is often striking. In many business units, specific digital initiatives take precedence because they are driven by the immediate priorities or critical requirements of the end customer. In other instances, regulatory compliance dictates the roadmap. Ultimately, delaying a plant’s digital transformation can be a strategic choice; these are complex business decisions managed by CXOs based on broader organizational goals.

Having said this, based on Gartner’s Top 10 Strategic Technology Trends for 2026, digital and AI technologies will continue to be fundamental to driving smart factory maturity. And according to IDC’s 2026 Manufacturing FutureScape, by 2027, 40% of factories’ operational data will be integrated across applications and platforms autonomously, due to increased standardization and the use of AI agents purpose-built for specific data.

In fact, I envision an agentic AI mesh in smart factories, working under an AI orchestrator layer, either collecting or sharing data in a multi-agent AI environment, with a human-in-the-loop (HITL) for critical business decisions.

Impact on the workforce skillset

In terms of coping with the impact of digital transformation, the world of the factory shopfloor workforce is changing at an ever faster pace. The tasks and activities performed by operators, supervisors, maintenance technicians, quality inspectors, material handlers and others need to be seen through digital, AI and smart factory lenses.

There is a growing realization within the workforce that the convergence of automation, AI, cloud/edge computing, and IIoT is fundamentally reshaping every manufacturing process. AI-driven shopfloor assistants have become increasingly common, guiding workers through machine maintenance, process automation, and quality checks. These digital tools are particularly vital during night shifts or off-hours, when fewer human experts are available on-site to provide support.

Over the last few years, I have observed manual quality inspections being steadily replaced or augmented by advanced vision systems. In fact, many modern machines now come with these cameras factory-installed. From robots performing thousands of precise welds on vehicle seating to the automated painting and injection molding of automotive parts, the shift is undeniable. Consequently, the workforce skillset required to drive digital transformation in these smart factories needs a comprehensive revisit. The sentiment of reskilling is well captured in the book “What Got You Here, Won’t Get You There,” though it’s more pertinent to managers or senior leaders.

Bridging the skillset gap

Through AI innovation, by 2031, over 30 million jobs per year will be redesigned, not eliminated. So, learning and development (L&D) leaders need to look at talent development and retention strategies that will stay relevant in the smart factory era and beyond.

Successful initiatives often involve learning and development (L&D) leaders collaborating with business unit heads and digital stakeholders to build a comprehensive transformation matrix. This matrix maps out the manufacturing processes most affected by AI and digital tools, identifies the relevant job roles, and aligns them with the necessary technologies—such as IIoT, cloud computing, Gen AI, agentic AI and computer vision.

From this matrix, the skillset gaps that process and technology changes create for the impacted roles are tracked and fed into the L&D talent development plan. This plan is developed at the BU/plant level, and the requisite investments in training and infrastructure are approved by the business head in conjunction with the digital head.
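As a sketch, such a transformation matrix can be modeled as a simple mapping from processes to roles and technologies; every process, role, and skill name below is an illustrative placeholder, not data from any specific deployment:

```python
# Sketch of a transformation matrix linking AI-affected manufacturing
# processes to the job roles involved and the technologies they now
# require. All entries are hypothetical placeholders.
transformation_matrix = [
    {
        "process": "quality inspection",
        "roles": ["quality inspector", "line supervisor"],
        "technologies": ["computer vision", "Gen AI"],
    },
    {
        "process": "machine maintenance",
        "roles": ["maintenance technician"],
        "technologies": ["IIoT", "agentic AI"],
    },
]

def skill_gaps(matrix, current_skills):
    """For each role, list required technologies it has not yet trained on."""
    gaps = {}
    for entry in matrix:
        for role in entry["roles"]:
            have = current_skills.get(role, set())
            need = set(entry["technologies"]) - have
            if need:
                gaps.setdefault(role, set()).update(need)
    return gaps

# Only the quality inspector has any prior training in this sketch.
gaps = skill_gaps(transformation_matrix,
                  {"quality inspector": {"computer vision"}})
```

The per-role gaps produced this way are what would feed the L&D talent development plan.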

In my experience, immersive technology-based training is quite effective in smart factories. Virtual reality (VR) and augmented reality (AR) solutions have helped cut training time by 20-50%, with full tracking of talent proficiency. This information is fed into the learning management system (LMS).

One of the most effective features is that the workforce skillset matrix is generated directly from the learning management system (LMS). This integration enables plant managers to assign operators to specific machinery based on their verified proficiency and skill levels. This automated allocation of production line personnel is becoming increasingly standard, effectively eliminating the risk of unqualified staff operating sensitive equipment. By ensuring the right person is at the right machine, organizations can significantly improve safety, ‘first-time-right’ rates and overall product quality.
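A minimal sketch of that allocation logic, assuming hypothetical machine names, required skills, and proficiency levels exported from an LMS:

```python
# Sketch: allow an operator onto a machine only when their LMS-verified
# proficiency meets the machine's minimum requirement.
# Machine names, skills, and levels are illustrative.
required = {
    "laser_welder": ("welding_cell", 3),  # (skill, minimum level)
    "cnc_mill": ("cnc_operation", 2),
}
proficiency = {                            # as exported from the LMS
    "operator_a": {"welding_cell": 4, "cnc_operation": 1},
    "operator_b": {"cnc_operation": 3},
}

def eligible_operators(machine):
    """Operators whose verified level meets the machine's minimum."""
    skill, minimum = required[machine]
    return sorted(op for op, skills in proficiency.items()
                  if skills.get(skill, 0) >= minimum)

assignments = {machine: eligible_operators(machine) for machine in required}
```

An unqualified operator simply never appears in a machine's eligibility list, which is the safety property the LMS integration is meant to guarantee.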

Keeping the workforce AI-ready

The digitalization of manufacturing generates vast quantities of data. While IT and digital teams are responsible for ensuring this data is captured securely on scalable platforms like the cloud, it is equally vital that the shopfloor workforce understands the underlying dataflow. When operators grasp how information moves through the system, they can better support the integrity and efficiency of the smart factory.

Furthermore, the workforce must recognize that data quality is the foundation of any effective AI solution — whether it involves shopfloor assistants or predictive forecasting. Because AI models are trained on specific datasets for specific use cases, their output is only as reliable as the input. Enterprises must strategically determine whether these models should be trained exclusively on internal enterprise data or supplemented with broader industry and internet-based information.

The bottom line is that AI-based solutions help organizations stay ahead of the curve in terms of differentiation, competitive edge, business decisioning, growth and so on. The upskilling and cross-skilling of the workforce, as per the talent development plan, should be updated and tracked through an AI lens, especially as this technology is changing so rapidly.

The best practice I have seen being followed in the industry is when the digital/AI team works with the HR and BU teams to identify training for different sets of employee groups. Shop floor training on digital and AI, for instance, will be a lot more hands-on and manufacturing-focused compared to training for mid/senior level executives, where the focus will be about the technology, its impact on the business and how to stay abreast of it.

Industry-specific certifications in digital and AI technologies can significantly enhance workforce productivity and efficiency. To complement formal training, many organizations now partner with startup ecosystems on relevant business projects, giving employees first-hand experience with emerging tools. Furthermore, ‘AI playgrounds’ allow business units to democratize these technologies by applying them to live use cases. Ultimately, bridging the skills gap requires more than just academic instruction; practical, hands-on exercises are essential to ensuring an AI-ready workforce.

This article is published as part of the Foundry Expert Contributor Network.


Why planning structures must evolve in modern manufacturing

22 April 2026, 06:00

Across many manufacturing organizations I have worked with, I keep seeing the same puzzling pattern.

Companies invest in better forecasting tools. They implement advanced planning systems. They improve supply chain processes.

Yet something strange still happens.

Some components are overplanned. Others are repeatedly short. Production teams start expediting parts. Suppliers are pushed to deliver faster.

Eventually, leaders ask the obvious question:

If planning systems are improving, why do these imbalances still occur — and why are teams still relying on spreadsheets and manual workarounds?

In my experience, the issue is rarely forecasting accuracy, execution capability or supplier performance. It begins with how planning parameters are defined inside enterprise systems.

Most ERP environments I have worked with still rely on static assumptions, while the real supply chain behaves dynamically. This mismatch between static planning logic and dynamic operational behavior is where structural imbalances originate.

The hidden problem: Static planning parameters

Across implementations, I consistently find that three tightly connected parameters drive planning behavior:

  • Planning Bills of Materials (Planning BOMs)
  • Lead Times
  • Safety Stock

These are typically maintained as master data, reviewed periodically and updated manually, generally once or twice a year. That approach may have worked in stable environments, but modern manufacturing operates under continuous change. Product configurations evolve, customer preferences shift and supply conditions fluctuate.

When these assumptions remain static, the system does not fail; it drifts. And that drift manifests as imbalance across components, time and availability.

Example #1: Planning BOM

In one environment I worked with, the Planning BOM assumed that 70% of orders used a standard PLC module and 30% used an advanced PLC. Over time, actual demand shifted and advanced PLC usage exceeded 50%.

However, the planning structure did not change, largely because updating it required significant manual effort and coordination across teams.

The result was not simply excess inventory — it was misalignment:

  • Overplanning of standard components
  • Underplanning of advanced components
  • Repeated substitutions and expediting

The forecast itself remained reasonably accurate. The imbalance emerged because demand was being translated through outdated structural assumptions.
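The mix drift described above can be made concrete with a small worked example; the order volume and the exact percentages here are hypothetical:

```python
# Worked sketch of Planning BOM mix drift: the structure still assumes
# a 70/30 split while actual demand has moved past 50% advanced.
# Order volume and mix figures are illustrative.
orders = 1000                     # forecast total, assumed accurate
planned_mix = {"standard_plc": 0.70, "advanced_plc": 0.30}
actual_mix  = {"standard_plc": 0.45, "advanced_plc": 0.55}

# Positive = overplanned units, negative = underplanned units.
imbalance = {part: round(orders * (planned_mix[part] - actual_mix[part]))
             for part in planned_mix}
```

The forecast total is untouched, yet the standard component is overplanned by exactly as many units as the advanced component is underplanned, which is the misalignment rather than the excess described above.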

More fundamentally, I have observed that Planning Bills of Materials, while central to ERP-driven planning, were never designed to capture the full complexity of manufacturing execution. Traditional BOM structures define what needs to be built, but not how it is built.

This limitation has been highlighted in patent US10832197B1, which introduces the concept of a “bill of work” to represent the actual activities, routing and process steps required for manufacturing. However, this type of execution-aware structural modeling is still rarely implemented in most ERP systems, which continue to rely primarily on static BOM definitions.

In my experience, this gap reinforces a broader point: Static planning structures alone are insufficient to model dynamic, real-world production environments.

Example #2: Lead time

I have seen cases where average demand remained stable at 100 units per week and lead time was assumed to be static at 10 weeks. In reality, lead time fluctuated between 8 and 14 weeks.

This did not just affect total inventory; it disrupted timing alignment:

  • Materials arriving too early for some components
  • Materials arriving too late for others

The issue was not quantity. It was synchronization across time.
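A small sketch illustrates the timing effect; the weekly demand, assumed lead time, and observed lead times are all illustrative:

```python
# Sketch: same weekly demand, correctly sized orders, but actual lead
# times of 8-14 weeks against an assumed static 10 weeks shift each
# arrival in time. All figures are illustrative.
DEMAND = 100       # units consumed per week
ASSUMED_LT = 10    # weeks, the static ERP parameter

# One replenishment order per week, planned to arrive ASSUMED_LT
# weeks after it is placed.
actual_lead_times = [10, 8, 14, 9, 13, 10, 12, 8]

arrivals = {}      # week -> units actually received
for placed_week, lead_time in enumerate(actual_lead_times):
    week = placed_week + lead_time
    arrivals[week] = arrivals.get(week, 0) + DEMAND

# Weeks where early and late orders collide, and weeks where the
# planned arrival never shows up at all.
doubled = [w for w, qty in arrivals.items() if qty > DEMAND]
missed  = [w for w in range(ASSUMED_LT, ASSUMED_LT + len(actual_lead_times))
           if w not in arrivals]
```

Total receipts still equal total demand, but some weeks see two orders land at once while others see nothing: a synchronization problem, not a quantity problem.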

Example #3: Safety stock

When shortages occur, organizations often increase safety stock. Most enterprise systems support this through simple mechanisms:

  • Fixed quantities
  • Coverage-based calculations

Safety Stock = Average Daily Demand × Days of Coverage

Both approaches assume relatively stable demand variability and supply risk.

However, real supply chains are not stable. Demand patterns shift, suppliers fluctuate and disruptions occur frequently. In this context, increasing safety stock often protects a distorted signal rather than correcting it.

In my work on inventory optimization, sometimes referred to as Garg’s Principle, I evaluate safety stock across the full forecast horizon rather than at a single point.

A simplified representation is:

Safety Stock = Target Service Inventory − Minimum Projected Inventory Across the Forecast Horizon

This approach identifies the lowest projected inventory point and ensures buffers protect that constraint. It transforms safety stock from a static buffer into a forward-looking stability mechanism.
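A minimal sketch contrasting the two calculations, with hypothetical demand, planned receipts, and service target:

```python
# Sketch contrasting the coverage-based safety stock formula with the
# horizon-based calculation described above. All figures are illustrative.
demand   = [90, 110, 100, 140, 80, 120, 100, 160]  # per period
receipts = [100] * 8                                # planned arrivals
on_hand  = 50                                       # starting inventory

# Coverage-based: average demand x periods of coverage (here, one).
avg_demand = sum(demand) / len(demand)
coverage_ss = avg_demand * 1

# Horizon-based: project inventory forward, find the lowest point
# across the horizon, and buffer that constraint.
projected, inv = [], on_hand
for d, r in zip(demand, receipts):
    inv += r - d
    projected.append(inv)

target_service_inventory = 100   # assumed service-level target
horizon_ss = target_service_inventory - min(projected)
```

In this sketch the coverage-based buffer (112.5 units) looks adequate on average, while the projected trough shows 150 units are needed to keep the whole horizon above the service target, which is exactly the difference between a static buffer and a forward-looking one.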

In practice, I consistently see that increasing buffers alone does not resolve imbalance:

  • Some components become over-buffered
  • Others remain constrained
  • Overall inventory may increase, but instability persists

The problem is not how much safety stock exists; it is how it is aligned.

Individually, each of the above three examples (planning BOM, lead time and safety stock) introduces distortion. Together, they amplify it.

Why static planning structures break in a dynamic world

Many ERP planning systems were designed for environments where product configurations, supplier behavior and demand patterns changed slowly.

That reality no longer exists.

Today’s manufacturing environments operate in constant change. Product variants evolve rapidly, customer expectations shift quickly and supply chains face ongoing disruption. Yet many planning models still assume stable product mixes, fixed lead times and constant buffers.

This gap between dynamic markets and static planning structures is where imbalances begin.

At a broader level, this reflects a structural limitation of ERP-centric planning. ERP systems are highly effective at executing transactions and maintaining control, but they extend past data into the future using relatively fixed assumptions. As highlighted in Why ERP-Centric Planning Can’t Keep Up with Modern Supply Chains, such systems often struggle to keep pace when demand patterns, supply variability and product configurations change continuously.

In many cases, supply chains do not struggle because forecasts are wrong; they struggle because the parameters translating demand into supply decisions remain static, are not updated regularly, or require huge manual effort to change.

Execution systems cannot fix planning imbalance

Planning imbalances do not remain confined to ERP systems; they propagate across the entire manufacturing stack.

Manufacturing Execution Systems (MES) and shop-floor operations depend on the plans they receive. When those plans are structurally imbalanced, execution systems cannot correct them; they simply operationalize the imbalance.

This relationship between planning and execution has been widely discussed in the context of modern MES platforms, which act as the bridge between enterprise systems and real-time production environments, as explored in Manufacturing execution systems: A comprehensive guide to selection and implementation.

I have also discussed a similar pattern in Why your ERP still can’t solve inventory drift — and the architecture that will, where ERP systems struggle not because they are broken, but because they operate on outdated assumptions.

From what I have seen, once a structural error enters the system, it flows through:

Forecast → Planning BOM → ERP → MES → Shop-floor execution

By the time production begins, the imbalance is already embedded.

From static to dynamic planning architecture

For CIOs, I do not see the solution as replacing ERP systems. Instead, I see an opportunity to modernize the intelligence layer that feeds them.

In my experience, artificial intelligence can transform static planning parameters into adaptive models that continuously learn from enterprise data.

AI-driven planning systems can incorporate:

  • Historical configurations and production data
  • Sales inputs and forward-looking programs
  • Engineering changes and substitution patterns
  • Supplier performance and variability

Using these inputs, machine learning models can estimate the probability distribution of components and dynamically generate Planning BOMs that reflect real-world behavior.

In parallel:

  • Lead times can be adjusted dynamically
  • Safety stock can be aligned with forward-looking variability

In practice, this works through four steps:

  1. Build a structural signature from early demand signals
  2. Identify comparable configurations using historical data
  3. Predict component mix probabilities
  4. Generate a dynamic Planning BOM

ERP remains the execution engine, but the structure feeding it becomes adaptive.
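As a sketch, the four steps above can be wired together with a naive exact-match over historical configurations standing in for the machine learning model; all signatures and component names are illustrative:

```python
# Sketch of the four-step dynamic Planning BOM flow, with an exact-match
# lookup standing in for a learned similarity model.
# All signatures and component names are illustrative.
history = [  # past orders: (structural signature, components used)
    ({"region": "EU", "segment": "industrial"}, {"advanced_plc"}),
    ({"region": "EU", "segment": "industrial"}, {"advanced_plc"}),
    ({"region": "EU", "segment": "industrial"}, {"standard_plc"}),
    ({"region": "US", "segment": "consumer"},   {"standard_plc"}),
]

def dynamic_planning_bom(signature):
    # Step 1: the structural signature built from early demand signals
    #         is passed in directly here.
    # Step 2: identify comparable configurations in historical data.
    comparable = [parts for sig, parts in history if sig == signature]
    # Step 3: predict component mix probabilities from the matches.
    counts = {}
    for parts in comparable:
        for part in parts:
            counts[part] = counts.get(part, 0) + 1
    total = len(comparable)
    # Step 4: generate the dynamic Planning BOM (component -> ratio).
    return {part: n / total for part, n in counts.items()}

bom = dynamic_planning_bom({"region": "EU", "segment": "industrial"})
```

In a real system, step 2 would use a learned similarity model rather than exact matching, but the structural output is the same: component probabilities that regenerate the Planning BOM as demand shifts, instead of a ratio frozen in master data.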

When I experimented with dynamic planning approaches, the impact was structural:

| Behavior | Traditional static planning | Dynamic planning |
| --- | --- | --- |
| Component alignment | Frequent mismatch | Improved alignment |
| Expediting | Frequent | Reduced by ~30–40% |
| Production schedules | Unstable | More predictable |
| ERP-MES alignment | Frequent substitutions | Improved synchronization |
| Safety stock behavior | Increasing without stability | Targeted and stable |

These results reinforce a broader lesson:

Planning challenges are not driven by lack of inventory; they are driven by lack of alignment.

Mini case study: Resolving structural imbalance

In one manufacturing environment I worked with, forecasting accuracy was strong and supplier performance was stable. Yet planning imbalance persisted.

At a system level, inventory appeared sufficient. However:

  • Critical components were frequently unavailable
  • Non-critical components accumulated
  • Production schedules required constant adjustment

The issue was not shortage; it was misalignment.

When I analyzed the system, I found:

  • Planning BOMs reflected outdated configurations
  • Lead times were fixed despite variability
  • Safety stock was increased uniformly

This created a cycle of persistent imbalance and expediting.

We shifted to a dynamic planning approach:

  • BOM assumptions aligned with actual demand
  • Lead times adjusted based on observed variability
  • Inventory evaluated across the planning horizon

Within a few cycles:

  • Imbalance reduced significantly
  • Expediting declined
  • Production schedules stabilized

The key change was not more inventory; it was better alignment.

A strategic opportunity for CIOs and supply chain VPs

From a CIO perspective, this represents a fundamental shift.

The question is no longer: “How do we improve planning tools?”

The better question is: “How do we transform static planning parameters into adaptive planning intelligence?”

Because in modern manufacturing, planning structure is strategy.

Conclusion

Based on my experience, traditional planning systems rely on static assumptions, while modern supply chains operate in constant change.

The challenge is not about inventory levels; it is planning alignment.

When planning structures remain static, imbalances persist — even when forecasting and execution improve.

But when planning becomes dynamic, when assumptions evolve with reality, those imbalances begin to disappear.

The next era of manufacturing advantage will come not from more inventory or faster execution, but from dynamic real-time alignment between planning assumptions and real-world behavior.

This article is published as part of the Foundry Expert Contributor Network.

Jessica Ferreira Vicente (SEAT): “Governance is a pillar that earns us the right to innovate fast”

17 April 2026, 09:10

Artificial intelligence (AI) plays a prominent role in the digital transformation that Spanish companies are undertaking. But it is not about digitizing for the sake of digitizing, as Jessica Ferreira Vicente, head of SEAT’s Digital Transformation Technical Office, explained, but about being consistent throughout the development process.

She could not have put it more graphically: “In the end, if you have, and pardon my language, a crappy process and you digitize it, you are going to produce digitized crap. We have to be aware that it is not about digitizing for the sake of it, just as it is not about adding AI for the sake of adding it,” she said.

That is why at SEAT, as she explained, “we work with models that improve the customer experience. From those ideas, we look at different parameters to prioritize the ones that make the most sense and then produce a technical definition. Because, I insist, you don’t add AI just to add it. Besides, we could find that the chatbot hallucinated and, as a result, did not always give the same answer. What we wanted was something that served more use cases, something industrialized,” she noted.

A process that, as she detailed to the audience, went through several phases: “First, governance, which we always put ahead of development. It is a very important step, because if the solution is developed without passing through controls, you can take on significant risk.” “Once that stage is passed, development begins, and it has to incorporate all the lessons we have learned beforehand. What happens if it is not developed well? Disbelief in AI sets in. So empowering people from the very first moment is fundamental so that they do not feel undervalued. It is essential that they see themselves benefiting from these solutions.”

“The good thing is that, at every moment,” she continued, “the customer, in this case the end user of the solution, has been involved. So they feel part of the implementation; it enters their business process very naturally because we have rethought the business process together. It did not start with them giving an idea, us figuring out how to do it, and then delivering a finished product. We built it together.”

To conclude, Jessica Ferreira Vicente left a final message for the audience: “At SEAT we work in a federated model in which IT provides the tools and the business areas maintain that framework by staying close to the business process. You cannot scale without governance; if we do, it is a house of cards. And we must not forget developing internal talent. Governance is not there to make AI slow or to hold back innovation, but to earn us the right to innovate fast. Because if we don’t do it well, it is the same as not doing it at all.”


Cargill deploys private 5G to aid factory AI and automation efforts

10 April 2026, 07:01

Connectivity at legacy facilities can present significant challenges for manufacturing companies seeking to optimize operations on the factory floor.

To remedy that, food production giant Cargill is tapping private 5G as a means for unlocking new levels of automation across its extensive system of factories, including the introduction of AI-powered robots.

NTT DATA’s private 5G network will provide the backbone for the company’s factory connectivity strategy, which was launched in March 2025 and covers 50 of its 1,100 facilities as of February 2026. The company plans to add private 5G to more than 100 sites per year.

The network provides Cargill with reliable, low-latency connectivity to smartphones and tablets on factory floors and has opened the door to experiments with AI-powered robots, including its deployment of Boston Dynamics’ Spot at its Amsterdam facility to automate inspections. The four-legged robot checks for hazards such as overheating equipment and looks for ways to improve worker safety.

Spot roams the factory in a preset pattern and builds a database of information about the conditions it finds there, says Robert Greiner, Cargill’s director of platform engineering for customer, commercial, and business operations digital technology.

“It’ll do vibration tests, it’ll do air quality tests, it’ll do a whole bunch of different measures of what the plant should look like in normal conditions,” he adds. “Because it’s doing that same path every day, it then starts building a database of what normal looks like and what normal doesn’t look like.”

[Image: Cargill robotics on the factory floor. Source: Cargill]

Cargill is exploring other ways to bring AI to its factory facilities, many of which are decades old, Greiner says. Reliable connectivity will enable the company to retrofit the buildings with modern sensors.

“Whether it’s a motor that turns or a mill, they generate heat, they have bearings, and they have failures,” he says. “5G has lit up a large area of those plants that didn’t have connectivity out there.”

More coverage

Cargill turned to NTT DATA and private 5G because of challenges with traditional Wi-Fi at many of its factories, Greiner says. In addition to covering a wider area than Wi-Fi routers, private 5G networking provides better connectivity through thick walls and other obstacles than public cellular networks, he notes.

“In the manufacturing environment, when you get outside of what I call a carpeted space, connectivity becomes an issue,” he adds. “Coming out of COVID, with Industry 4.0, there’s been a need for advanced connectivity out there in the plant floor, and our model was struggling to get that connectivity.”

Cargill can now deploy one private 5G access point to cover the same area as about nine Wi-Fi access points. And while private 5G access points can cost more than Wi-Fi equipment, additional savings come during installation, with a 70% reduction in cabling and setup costs, Greiner says.

“In our environment we mostly have to run that cabling in conduits, and we have all that infrastructure cost that has to go into the factory floor to enable that access point toward the other side,” he says.

Meanwhile, private 5G gives the company more control over its networks than public cellular networks would, he adds.

“If you’re running on that public network and you’re in the middle of Nebraska, then the school lets out and the school bus pulls up next to the plant and every kid starts streaming data,” he says. “You’re relying on that connection to do some process, but that cell tower could be overrun by the school bus that just happens to be sitting there at that critical time.”

Pen and paper no more

Private 5G also will enable Cargill to update major software platforms and other apps in a secure and reliable way, Greiner says. The company has had several small warehouses sprinkled around the world with no connectivity, and private 5G deployments will allow them to install ERP systems.

“These dark warehouses didn’t have Wi-Fi in them, and they basically were using a No. 2 pencil and a yellow pad for keeping track of the inventory,” he says. “They’re moving to SAP, they have an inventory management system now, and they had the ability to switch over to an electronic inventory system, a warehousing system.”

Cargill took a smart approach to deploying private 5G by approaching it as foundational infrastructure rather than a single-use technology, says Parma Sandhu, vice president of enterprise 5G products and services at NTT.

“Instead of building networks for individual applications, the company deployed connectivity across facilities so multiple use cases — connected worker, robotics, sensors, inspections, and worker tools — can run on the same network,” Sandhu adds. “That approach allows new capabilities to be added over time without rebuilding the underlying connectivity.”

Private 5G can use several slices of the radio spectrum, and NTT DATA works with customers to find the best spectrum for their needs, Sandhu says. Connections can vary from sub-300Mbps to multigigabit speeds, depending on the spectrum used, but throughput isn’t the primary concern on most factory floors, he adds.

“In industrial environments, reliability and consistency matter more than peak speed,” Sandhu says. “Private 5G delivers high capacity and low latency, but the real advantage is secure, predictable connectivity across large facilities with thousands of connected devices. That reliability is what enables automation, robotics, and real-time monitoring on the factory floor.”

Private 5G is gaining traction in manufacturing as factories embrace generative AI, agentic AI, edge AI, and physical AI, Sandhu says. “There has been an explosion in demand for OT data, which requires more compute power and faster, more reliable, and more secure connectivity,” he adds.

The factory use case

Private 5G makes sense in factory settings, says Jason Leigh, senior research manager for the mobility team at IT analyst firm IDC. While the gap is narrowing, private 5G has given factories more control over network performance than traditional Wi-Fi, and it is also built on a zero-trust security model, he adds.

“If you’re deploying a private network, you can pretty much tune it to say, ‘This network is always going to give me 100 megabits down, 50 megabits up,’” he says. “You can get a little better performance and control who comes on and off the network.”

While outsiders can access a Wi-Fi network if they have the password, private 5G can authenticate at the device SIM level, Leigh says. “It doesn’t matter if you have the password,” he adds.

Private 5G also has advantages as factories adopt more automation and other digital transformation initiatives, Leigh says. While smartphones and tablets running standard applications may not need a specialized network, technologies like AR and VR can benefit.

“Where it gets interesting is when you move towards more automation, more robotics,” he adds. “When you’re running a high-speed factory line and you’re using video to scan for quality issues, with private 5G, you can run that at high speed.”

AI-driven maintenance will need stable connections, he says. “You want that real-time super low-latency connection to exchange the image with the processing and back,” he adds. “You don’t want 10 minutes before the data processing to say, ‘This was an error in this problem, and the product should have been rejected.’”
