The Client is a national land warfare force comprising nearly 80,000 regular, trained personnel.
- Near real-time, two-way translation of speech
- The ability to translate text captured with the device’s camera
- The option to add words and terms to the built-in dictionaries
- Translation functionality available in offline mode
The Client’s infantry is regularly deployed overseas, often to countries where many different languages are spoken. Being able to converse with the local population is of critical importance: it allows personnel to interact effectively, build trust through mutual understanding, and gain more detailed situational awareness. Timely, local access to information can be critical to mission success and has the potential to save lives.
Human interpreters working alongside Army personnel currently facilitate this in-person communication. Access to interpreters can be limited, however, and even where one is available, they may not speak the full range of languages encountered.
Accurate, reliable AI-based language translation has recently become available, and the Client identified it as a potentially useful tool. However, such services are typically cloud-based, which raises potentially showstopping concerns:
- Conversation data, which may be extremely sensitive and therefore should not leave the Army’s domain, would be transferred over the public internet, then processed and potentially stored in the public cloud.
- Connectivity may not be available in an operational environment, rendering an online-only solution of limited use.
Following an introduction by Microsoft to the Client’s innovation team, Objectivity ran an AI workshop where Army personnel and Objectivity discussed the challenge and identified potential high-level solutions. Requirements were refined during subsequent meetings, and the scope for a Proof of Concept was agreed. We then built an Android application capable of offline, very near real-time translation of:
- Speech from in-person conversation, including split-screen, two-way translation for both parties. This functionality can be operated via an external speaker’s controls, streamlining conversations.
- Speech captured via an audio stream from on-device media.
- Text from images and video captured via the device’s on-board camera.
The application also allows users to modify the available dictionaries, adding specialist terms and abbreviations.
Another useful offline feature is the commands module: with a single button press, users can play back pre-recorded commands in various languages, such as “stay there”, “don’t move”, or “show me your ID”.
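The commands module described above amounts to a lookup from a command and a language to a pre-recorded audio clip. The sketch below illustrates that idea in Kotlin; all names (`CommandLibrary`, `clipFor`, the file paths) are illustrative assumptions, not the actual application code.

```kotlin
// Illustrative sketch only: not the actual application code.
// Maps (command, language) pairs to pre-recorded audio clip paths.
class CommandLibrary(private val clips: Map<Pair<String, String>, String>) {
    // Returns the clip path for a command in a given language,
    // or null if no recording exists for that combination.
    fun clipFor(command: String, language: String): String? =
        clips[command to language]
}

fun main() {
    val library = CommandLibrary(
        mapOf(
            ("stay_there" to "fr") to "clips/fr/stay_there.ogg",
            ("dont_move" to "fr") to "clips/fr/dont_move.ogg",
        )
    )
    // On a button press, the app would resolve the clip for the selected
    // language and hand it to a media player for playback.
    println(library.clipFor("stay_there", "fr"))
}
```

Because the clips are stored on the device, this lookup works entirely offline; only the recordings themselves need to be provisioned in advance.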
The application automatically performs speech-to-text transcription and records every audio file together with its GPS coordinates and device ID (IMEI).
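Conceptually, each recording is stored alongside its metadata. A minimal sketch of such a record is shown below; the field names and schema are assumptions for illustration, not the application's actual data model.

```kotlin
// Illustrative sketch of the metadata bundled with each recording.
// Field names are assumptions, not the application's actual schema.
data class TranslationRecord(
    val transcript: String,   // speech-to-text output
    val latitude: Double,     // GPS fix at capture time
    val longitude: Double,
    val deviceImei: String,   // device identifier
    val audioPath: String     // location of the saved audio file
)

fun main() {
    val record = TranslationRecord(
        transcript = "Where is the nearest clinic?",
        latitude = 51.5074,
        longitude = -0.1278,
        deviceImei = "356938035643809",
        audioPath = "recordings/session-001.wav"
    )
    println(record.audioPath)
}
```

Keeping the transcript, position, and device ID together with the audio file makes each captured conversation traceable after the fact.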
It is also possible to capture audio from an external microphone, which can offer better audio quality than the device’s built-in hardware. The software integrates easily with other devices, such as an external speaker that can serve as a microphone, speaker, and conversation-control mechanism, or Epson Moverio Smart Glasses, which can capture images and display their translation on the glasses’ screen.
The application is built around Microsoft AI: it leverages the Microsoft Translator SDK, with support for individual languages provided by offline language packs. When used in online mode, the application has access to a wider variety of languages and can automatically detect the input language. The Optical Character Recognition (OCR) module can also recognise handwritten text.
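The online/offline behaviour described above implies a simple decision: use the online service when connectivity is available, otherwise fall back to an installed language pack. The sketch below illustrates that decision in Kotlin; `TranslatorBackend`, `chooseBackend`, and the pack check are illustrative assumptions, not the Microsoft Translator SDK's API.

```kotlin
// Illustrative sketch of the online/offline decision; not the SDK's API.
enum class TranslatorBackend { ONLINE, OFFLINE }

fun chooseBackend(
    hasConnectivity: Boolean,
    installedPacks: Set<String>,   // language codes with offline packs installed
    targetLanguage: String
): TranslatorBackend? = when {
    // Online mode: wider language coverage and automatic language detection.
    hasConnectivity -> TranslatorBackend.ONLINE
    // Offline mode: only languages with a local pack are available.
    targetLanguage in installedPacks -> TranslatorBackend.OFFLINE
    // Neither connectivity nor a local pack: translation is unavailable.
    else -> null
}

fun main() {
    println(chooseBackend(hasConnectivity = false, installedPacks = setOf("ar", "fr"), targetLanguage = "ar"))
}
```

The key operational point is the fallback: provisioning the right language packs before deployment determines what remains usable when connectivity is lost.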
The Proof of Concept was extremely well received, and Objectivity has secured a second contract to build on it, adding support for additional use cases.