Localization strategy
AI, ROI, and More: 5 Big Insights from tcworld
It’s fair to say that the localization business rarely sits still. With the year drawing to a close, it’s always worth stopping to take a look at the business landscape and see which trends and new technologies are set to drive change in the industry, and how we can be prepared to tackle them. Fortunately, our industry is home to some great conferences to help you stay abreast of the latest trends.
Recently, the tcworld conference (and the tekom fair) brought together technical communication and localization experts to discuss the biggest emerging trends and technologies. Sessions this year centered around the responsible use of AI, improving content impact, and refining documentation and processes. With a mix of workshops and talks, the focus was very much on actionable strategies. Here’s a closer look at the big trends and key takeaways from this year’s conference:
Tackling misinformation in the age of AI
With the rise of AI-driven content, ensuring accuracy and combating misinformation has become a top priority. Several key sessions looked at AI quality and user experience, and shed some light on ways technical communicators can proactively safeguard against misleading content.
Countering fake news through better UX
In a session on misinformation, Ray Gallon from The Transformation Society argued that effective UX strategies can help counter fake news. He stressed that presenting accurate information alone is insufficient; communicators must appeal to users’ emotions. Techniques like user journey mapping and inferential learning help create engaging, accurate narratives without sensationalism.
Addressing AI errors at the source
Regina Preciado’s session “Your AI Isn’t Broken, Your Content Is” highlighted the integral role of content quality in enhancing AI-generated documentation. Looking in particular at AI hallucinations and inaccuracies, Regina homed in on poorly structured content, advising businesses to establish a single-source-of-truth approach for all of their documentation and content – no matter its intended audience.
She emphasized how always referring to an original source helps AI systems provide consistent, reliable outputs, and how reliable AI can help secure trust and future investment internally.
Enhancing AI accuracy with ontologies
Prof. Dr. Martin Ley and Max Gärber from PANTOPIX emphasized the importance of ontologies—structured frameworks that define relationships between concepts—to improve the accuracy of generative AI. Product ontologies serve as a foundation for large language models (LLMs), ensuring they interpret product data more reliably.
For communicators, this means that using structured data frameworks can help mitigate errors in AI-powered chat applications, adding precision to user interactions and reducing misinformation risks.
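To make the ontology idea concrete, here is a minimal, hypothetical sketch (not the PANTOPIX implementation): product facts stored as subject–predicate–object triples, retrieved and rendered as plain-text context that could be prepended to an LLM prompt to ground its answer. All product names and relations are invented for illustration.

```javascript
// Minimal sketch: a product ontology as subject–predicate–object triples.
// The product names and relations below are hypothetical examples.
const ontology = [
  { subject: "X200-pump", predicate: "isA", object: "centrifugal pump" },
  { subject: "X200-pump", predicate: "hasMaxPressure", object: "16 bar" },
  { subject: "X200-pump", predicate: "compatibleWith", object: "DN50 flange" },
];

// Collect every known fact about a product and format it as plain text.
// Prepending this context to a prompt constrains the model to verified
// product data instead of letting it guess.
function groundingContext(product) {
  return ontology
    .filter((t) => t.subject === product)
    .map((t) => `${t.subject} ${t.predicate} ${t.object}`)
    .join("\n");
}

console.log(groundingContext("X200-pump"));
```

Real deployments typically express such frameworks in standards like RDF or OWL and query them with SPARQL, but the retrieval-then-ground pattern is the same.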
Multiple sessions this year stressed the importance of structured, rigorous data inputs when training AI and building processes. From proactive content verification to UX design, it’s important that companies plan ahead in order to create and maintain credible, reliable AI-generated communication.
Maximizing ROI and quality in global communications
As localization becomes a more crucial component of internationalization strategies, ROI is increasingly on the agenda, with more financial eyeballs on the function than ever. Beyond typical localization metrics, several speakers discussed how to quantify ROI more effectively, and how financially savvy approaches can actually impact the user experience and the fortunes of an entire business.
Linking localization to ROI
In an interesting session linking localization efforts directly to business goals, Translate.One’s Quality and Product Manager Christiane Schaeffler highlighted the need for both quantitative metrics (like market expansion and customer retention) and qualitative indicators (such as enhanced user experience).
By aligning around customer satisfaction, rather than relying on siloed metrics like turnaround time, it becomes possible to build more compelling business cases for localization investments, and fine-tune strategies to improve content relevance and impact.
Enhancing accessibility with LangOps
Vivien Krämer and Jochen Hummel gave a wide-ranging intro to Language Operations (LangOps). LangOps combines machine translation, advanced text analytics, and multilingual knowledge systems to optimize communication efforts at scale. Exploring its recent adoption at Roche Diagnostics, they explained how LangOps has enhanced content accessibility across languages, and proven useful in standardizing regulatory compliance. By unifying language resources, LangOps supports product localization and also assists with search and customer support, making it a versatile enterprise tool.
Tracking localization ROI, leveraging AI-powered LangOps, and fostering cultural competence are crucial for ensuring effective global comms. This approach ensures that localized content meets quality standards while resonating with diverse audiences, maximizing both engagement and ROI.
Optimizing documentation processes and efficiency
As content grows more complex, documentation teams face the challenge of managing reviews efficiently. These sessions shared techniques to streamline review cycles, embed quality assurance early, and prioritize sustainable content design.
Tackling review challenges in technical documentation
Dr. Saul Carliner from Concordia University identified five common issues in documentation reviews, such as contradictory feedback and unresponsive reviewers. He recommended solutions like setting clear review guidelines and holding pre-review meetings to clarify expectations, creating more relevant and timely feedback. Carliner also highlighted the importance of cultural competence, helping communicators assess cultural sensitivity, ethical considerations, and adaptability for inclusive, globally resonant content.
Embedding quality assurance early: The shift-left approach
Valentina Turra and Daniela Fleck from Philips shared their “shift-left” approach to localization, which emphasizes proactive quality checks early in content development. This strategy allows teams to resolve issues early, producing clear, consistent documentation that meets user needs. At Philips, this approach has improved documentation quality and streamlined user experience.
Promoting sustainable UX in documentation
Nolwenn Kerzreho from MadCap Software advocated for minimalist, sustainable content design to reduce information overload. By focusing on essential information, technical communicators can create clearer, more user-friendly documentation, enhancing engagement through simplicity.
All of these sessions underscored the value of refined review processes, early quality checks, and sustainable design principles, enabling teams to produce efficient, user-focused documentation that improves workflows and user experiences.
Enhancing user experiences with data
Long used for customer-facing applications, data-driven insights are now changing the game for technical documentation. Several sessions illustrated how technical communicators can use data, JavaScript, and tailored workflows to create content that resonates with today’s users.
Enhancing documentation through data
Rachael Hewetson and Sophie Sofce Kohl from SAP highlighted how data-driven insights guide documentation improvements. By analyzing metrics like page views and search trends, SAP identifies content gaps and reduces support requests. This approach enables communicators to prioritize updates that directly address user needs, resulting in documentation that is both relevant and user-friendly.
Empowering writers with JavaScript
Collibra’s Ken De Wachter encouraged technical writers to learn basic JavaScript to add interactive elements such as tabs and dropdowns to documentation. These elements make content more engaging and user-friendly. De Wachter emphasized that even a small amount of coding knowledge can enable writers to meet modern expectations for dynamic, interactive documentation.
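As a rough illustration of how little code this takes (a hypothetical sketch, not De Wachter’s own example), the helper below generates the markup for a simple tab group; a few more lines of click-handling JavaScript on the page would toggle the `active` class to switch tabs.

```javascript
// Hypothetical helper: turn an array of { label, body } objects into
// the HTML for a simple tab group. The first tab starts active; the
// page's stylesheet and a small click handler do the rest.
function renderTabs(tabs) {
  const buttons = tabs
    .map((t, i) => `<button class="tab${i === 0 ? " active" : ""}">${t.label}</button>`)
    .join("");
  const panels = tabs
    .map((t, i) => `<div class="panel${i === 0 ? " active" : ""}">${t.body}</div>`)
    .join("");
  return `<div class="tab-group">${buttons}${panels}</div>`;
}

console.log(renderTabs([
  { label: "Windows", body: "Run setup.exe" },
  { label: "Linux", body: "Run install.sh" },
]));
```

Even this string-building approach works in static-site documentation pipelines, where writers rarely control the full front-end stack.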
Customizing DITA DocOps for corporate efficiency
Dia Daur from A-Jour-Net Inc. discussed how corporate teams can optimize their DITA (Darwin Information Typing Architecture) workflows. Customizing DITA DocOps enables efficient content management: enforcing taxonomy standards, simplifying navigation, and facilitating mass updates. This approach ensures content consistency and adaptability across various channels, which is essential for large teams.
Combining data insights, interactivity, and optimized workflows empowers technical writers to deliver engaging, user-centered content. These methods help meet the high expectations of today’s users by enhancing the usability, relevance, and accessibility of documentation.
Leveraging simplified language standards
Clarity in technical documentation is especially important when dealing with multilingual operations teams. A pair of useful sessions highlighted how language standards like Simplified Technical English (STE) support clearer communication and reduce translation errors.
Writing procedures in Simplified Technical English (STE)
Daniela Zambrini and Orlando Chiarello from eXeL8 and STEMG introduced ASD-STE100, a controlled language standard designed to simplify technical English. They outlined STE’s structure and vocabulary limitations, showing how it eliminates ambiguity for readers. Through practical exercises, attendees saw how STE enhances understanding for non-native speakers and translators, enabling them to convey technical details accurately.
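ASD-STE100 restricts writers to an approved dictionary, with one approved meaning per word. Full STE checkers are sophisticated commercial tools, but the toy function below, using a deliberately tiny made-up word list rather than the real STE dictionary, shows the basic mechanic: flag any word that is not on the approved list so the writer can substitute an approved alternative.

```javascript
// Toy illustration of a controlled-language check. The approved list is
// a tiny invented sample, NOT the real ASD-STE100 dictionary.
const approved = new Set([
  "remove", "the", "bolt", "before", "you", "install", "cover",
]);

// Return the words in a sentence that are not on the approved list.
function unapprovedWords(sentence) {
  return (sentence.toLowerCase().match(/[a-z]+/g) || [])
    .filter((word) => !approved.has(word));
}

console.log(unapprovedWords("Extract the bolt before you affix the cover"));
// flags "extract" and "affix" (STE prefers "remove" and "install")
```

The real standard also enforces grammar rules (short sentences, active voice, one instruction per sentence) that go well beyond vocabulary checks.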
STE’s role in AI-assisted content creation
Orlando Chiarello also joined Jennifer Bennor to discuss the importance of STE in AI-driven content creation. As AI increasingly supports translation and content generation, standards like STE are essential to maintain clarity. By minimizing vocabulary and simplifying structure, STE helps prevent AI misinterpretations, ensuring content remains accurate and consistent across languages.
Simplified Technical English (STE) plays a vital role in creating clear, translatable documentation. Controlled language standards like STE are critical for ensuring accuracy, particularly as AI becomes more involved in multilingual content delivery.
This year’s tcworld conference emphasized the need for structured frameworks, from UX design to STE, in managing misinformation and leveraging AI effectively. Sessions underscored the importance of measuring localization ROI, adopting LangOps, and nurturing cultural awareness to ensure content resonates with diverse audiences. Additionally, prioritizing documentation efficiency and integrating data-driven insights and interactivity were highlighted as essential steps for technical communicators aiming to meet the demands of a globalized digital landscape.