
Hanford Mission Integration Solutions recently deployed an artificial intelligence tool, Hanford AI Liaison, or HAL, across the Hanford site. The friendly logo for the technology – generated by the tool itself – is featured on the wall above the development team. From left are Jason Walli, business intelligence manager; John Lawson, AI developer; Myles Gregory, AI developer; Spencer Myrick, software developer; Evan Derrick, business intelligence project manager; and Michael Winkel, deputy vice president, information management services.
Courtesy Hanford Mission Integration Solutions

It’s no coincidence that Hanford’s new artificial intelligence tool shares its nickname with the computer featured in the classic science fiction film “2001: A Space Odyssey.”
The name HAL, which is short for Hanford AI Liaison, was suggested by someone in one of the first test groups, and when it came to a vote on the top 20 options, the group was overwhelmingly in favor of HAL.
Who says nerdy tech users don’t have a sense of humor?
HAL’s name is not only a nod to the HAL 9000 of Stanley Kubrick’s 1968 film, a computer that could run spaceship operations, speak and play games with humans, and even turn against the ship’s human inhabitants; it also plays a role in humanizing the technology.
The Hanford AI tool’s logo – generated by the tool itself – features a friendly robot holding a cup of coffee with its name on it.
Hanford’s AI tool doesn’t talk or control any machines, but it can condense emails and analyze and summarize complicated documents, saving Hanford’s human workers tons of time, according to Hanford Mission Integration Solutions (HMIS), which developed the system.
Hanford’s HAL, an AI tool built on ChatGPT technology, was released to thousands of workers this spring. It’s designed as an internal tool, and it doesn’t share data outside the Hanford site.
It was developed and launched in just months, said Michael Winkel, deputy vice president of information management services at HMIS. The concept was first envisioned less than a year ago, and three months later, the model was fully developed and rolled out to a small pilot group in October 2024.
Over the following six months, the tool was rolled out to more and more users until it reached all 9,400 Hanford workers in mid-March.
Within the first two weeks of its full sitewide release, nearly 2,000 individuals had submitted more than 35,000 prompts to HAL.
HAL was built based on the GPT-4o model, and it looks and feels very much like ChatGPT does, Winkel said. It can help produce written content for emails or communications; process large and technical documents and provide summaries; and pull information from internet data like a Google search would.
Unlike ChatGPT and publicly available AI tools, HAL is specific to the Hanford system and operates in isolation.
It “does not send information out to the external environment, which is obviously very important for security purposes for us, and allows us to attach files to it and analyze sensitive documents without the risk of that existing out on the World Wide Web,” Winkel said.
The tool is a time-saver for its users. Winkel said a conservative estimate is 15 minutes to an hour saved per person per week; multiplied across the Hanford site’s workforce, that adds up to a lot of time.
“It’s got just an inherent ability to raise the capabilities of people in areas that they would not only be proficient in. That’s kind of the beauty of ChatGPT in general,” he said.
Technology is always evolving, and even though HAL was just recently released, plans are already afoot for future upgrades.
One issue users run into when using HAL as an internet search tool is that the data the original model was trained on only extends to October 2023, Winkel said. That means any information published after that date isn’t available.
To get around that issue, Winkel said the plan is to connect HAL to the Microsoft Bing search engine to access current information.
The logo for the Hanford AI Liaison was created by the tool itself, asked to generate an image of itself. HAL can generate text and images in response to prompts, in addition to analyzing long and technical documents.
Courtesy Hanford Mission Integration Solutions

Another project in the works is “grounding” HAL to Hanford’s data, or giving it access to pull from that data to answer questions. As an example, if HAL had access to Hanford’s procedure system, “you can ask very open-ended questions like, ‘Tell me about my time off policy’” without having to search for the information yourself, Winkel said.
To get to that point, the databases first need to be combed through to make sure there aren’t problems in the data, and good policies need to be set. That process is currently underway.
“AI can just as much as it will exemplify and amplify your productivity, it can amplify inherent problems you have in data sets,” Winkel said. It’s important to be ready before grounding HAL to the data. “… It’s a very measured and strategic approach. We’re moving quickly, but we’re not moving recklessly.”
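HMIS has not published HAL’s internals, but the “grounding” Winkel describes follows a well-known pattern: retrieve the relevant internal documents for a question, then hand them to the model as context. The toy sketch below illustrates the idea with naive keyword matching; the corpus, document names, and scoring method are all hypothetical stand-ins, not HAL’s actual design.

```python
# Illustrative sketch of "grounding" a question in internal documents.
# All names and data here are hypothetical; a production system would
# use vector embeddings and a real document store, not keyword overlap.
from dataclasses import dataclass


@dataclass
class Document:
    title: str
    text: str


def score(question: str, doc: Document) -> int:
    """Naive relevance score: count of shared lowercase words."""
    q_words = set(question.lower().split())
    d_words = set(doc.text.lower().split())
    return len(q_words & d_words)


def retrieve(question: str, corpus: list[Document], k: int = 1) -> list[Document]:
    """Return the k documents most relevant to the question."""
    return sorted(corpus, key=lambda d: score(question, d), reverse=True)[:k]


def build_prompt(question: str, corpus: list[Document]) -> str:
    """Assemble a grounded prompt: retrieved text plus the user's question."""
    context = "\n".join(d.text for d in retrieve(question, corpus))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"


corpus = [
    Document("Time off", "Employees accrue time off each pay period per the leave policy."),
    Document("Badging", "Site badges must be renewed annually at the badging office."),
]
print(build_prompt("Tell me about my time off policy", corpus))
```

The payoff Winkel describes is visible even in this sketch: the open-ended question surfaces the leave-policy text without the user searching for it, and the model is constrained to answer from vetted internal data. It also shows why the data-cleanup step matters first: whatever is in the corpus, errors included, flows straight into the prompt.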
As other technology in the industry continues to update, HAL’s developers will keep Hanford’s tool up to speed as well. While HAL currently only processes and generates images and text, there may be an opportunity to incorporate speech recognition and text-to-speech features in the future.
“We can continue to modernize the HAL tool and keep it up with industry average and industry standards,” Winkel said.
As with any other software, updates will be tested before release, and users should experience minimal disruption.
As the AI tool is used, workers at the Hanford site have the opportunity to offer real-time feedback. They can use a thumbs up or thumbs down in the tool to indicate whether a response is appropriate and accurate, or if the outcome was unexpected.
Users can share new features they’d like to see, and that feedback can be used in HAL’s future development, Winkel said.
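The article doesn’t describe how that thumbs-up/thumbs-down feedback is processed, but the mechanism it implies, tallying votes per kind of use so developers can see where responses fall short, can be sketched in a few lines. Everything below is a hypothetical illustration, not HAL’s actual feedback pipeline.

```python
# Hypothetical sketch of aggregating thumbs-up/thumbs-down feedback
# by usage category, so developers can spot weak areas.
from collections import Counter


def summarize_feedback(events: list[tuple[str, str]]) -> dict[str, Counter]:
    """Tally 'up'/'down' votes per category of prompt."""
    tallies: dict[str, Counter] = {}
    for category, vote in events:
        tallies.setdefault(category, Counter())[vote] += 1
    return tallies


events = [
    ("summarization", "up"),
    ("summarization", "up"),
    ("search", "down"),
]
print(summarize_feedback(events))
```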
For those less comfortable using AI tools, an online training program has been created not only to familiarize users with the technology, but to explain how to craft prompts that produce the desired outcome.
“I’ve typically found there’s a lot of users that are very excited to use AI tools, and there’s a lot of users that are very nervous on using AI tools, so they’re not sure how to use AI tools, or they haven’t embraced it yet,” Winkel said.
He said it’s important to him that users are comfortable with the technology, and that it’s straightforward enough for the average person to use.
Overall, Winkel said the technology has been a success. Other federal sites have been asking about HAL and looking to use similar technology.
“We really want to be on that leading edge for the federal government here,” he said.