The rush to bring the benefits of AI and generative AI to federal technology applications has dominated mainstream media coverage. Solution and services providers to the federal government are racing to position themselves and expand their offerings with new AI capabilities, which brings new challenges.
Unlike many other commercial solutions, CGI’s Momentum® is specifically designed around the nuances of federal regulations and is a leading choice for today’s federal agencies for budgeting, financial management, acquisitions, and asset management (the latter provided by CGI’s Sunflower solution).
At CGI, our approach to generative AI for our Momentum and Sunflower solutions focuses on three principles: choice, support, and security. In this first blog in the series, I will discuss what “choice” means with our Momentum and Sunflower solutions.
An agnostic approach enables agencies to choose with freedom from lock-in
Every agency is unique, with different missions, environments, and technologies. While entering the AI realm for efficiencies is exciting, agencies need to proceed with caution, as new capabilities bring new risks and vulnerabilities. The White House continues to provide guidance on this topic with its latest executive order.
At CGI, we understand that balancing business and mission priorities is a delicate process and that federal agencies need a solution flexible enough to keep up with changes in mandates, policies, and the technology landscape. Our approach to generative AI brings these benefits to federal agencies with the highest levels of flexibility and choice. We’ve embedded application programming interfaces (APIs) and accelerator packages into our Momentum® and Sunflower™ solutions, enabling customers to leverage generative AI capabilities with their own data, using their own data models or models we’ve provided in the systems. In other words, the customer controls the data set and the AI interface.
Our AI vision and our Momentum® and Sunflower™ solution roadmaps are aligned with the principles of remaining agnostic and supporting a bring-your-own consumption model (not CGI-owned). As a systems integrator, CGI takes a broader approach to AI, with the ability to connect to any model, cloud, or tool, so our federal clients have the flexibility to seamlessly add new AI capabilities into their existing applications. This approach allows agencies to gain new capabilities and leverage competitive pricing models without being locked into a specific approach, engine, or tool.
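To make the idea of an agnostic, bring-your-own-model approach concrete, the sketch below shows one way an application can code against a common interface and choose the underlying engine through configuration. The class, provider, and endpoint names are hypothetical; this is a simplified illustration of the pattern, not CGI’s implementation.

```python
# Hypothetical sketch of a provider-agnostic text-generation layer.
# Class, provider, and endpoint names are illustrative only.
from abc import ABC, abstractmethod


class TextGenerator(ABC):
    """Common interface the application codes against, regardless of vendor."""

    @abstractmethod
    def generate(self, prompt: str) -> str:
        ...


class AgencyHostedModel(TextGenerator):
    """Wraps a model the agency hosts in its own environment (placeholder logic)."""

    def __init__(self, endpoint: str):
        self.endpoint = endpoint  # e.g., an internal inference endpoint

    def generate(self, prompt: str) -> str:
        # In practice this would call the agency's own inference service.
        return f"[agency model at {self.endpoint}] response to: {prompt}"


class CommercialCloudModel(TextGenerator):
    """Wraps a commercial cloud model selected by the agency (placeholder logic)."""

    def __init__(self, provider: str):
        self.provider = provider

    def generate(self, prompt: str) -> str:
        # In practice this would call the selected provider's API.
        return f"[{self.provider} model] response to: {prompt}"


def build_generator(config: dict) -> TextGenerator:
    """Pick an engine from configuration, so swapping vendors is a config change."""
    if config["engine"] == "agency_hosted":
        return AgencyHostedModel(config["endpoint"])
    return CommercialCloudModel(config["provider"])


# Changing engines is a configuration change, not an application rewrite.
generator = build_generator({"engine": "agency_hosted",
                             "endpoint": "https://inference.agency.internal"})
print(generator.generate("Summarize open obligations for fiscal year 2024."))
```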
Choose an AI journey unique to your agency
The power of generative AI comes from learning from large sets of data; as a system ingests and analyzes vast amounts of data and information, it gains a wide array of choices it will later use when generating its own text. Because our generative AI and accelerator packages are cloud and tool agnostic, the right APIs enable these AI functions in any IT environment.
However, it is important to train the AI on data that is directly relevant to the system’s intended use. This is where Momentum and Sunflower clients can experiment and refine their approach, providing the system with specific agency data and thereby training it directly on the agency’s information parameters for expected outcomes and processes. This reduces the likelihood of “hallucinations” that can occur when a generative AI model presents incorrect responses based on the highest-probability data or on bias in large data sets. Our accelerator packages, which are prepackaged solutions that work against Momentum and Sunflower data, reduce the hurdles of adoption, so agencies can access generative AI benefits against Momentum datasets with proven AI solutions.
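A common pattern for keeping responses tied to an agency’s own records is to retrieve the relevant records first and pass them to the model as the only allowed context. The sketch below is a hypothetical, simplified illustration of that grounding pattern; the record format, data, and function names are not Momentum or Sunflower structures.

```python
# Hypothetical, simplified sketch of grounding a prompt in agency-owned records.
# Data, field names, and functions are illustrative only.

AGENCY_RECORDS = [
    {"id": "OBL-1041", "text": "Obligation OBL-1041: $250,000 for IT support services, FY2024."},
    {"id": "OBL-1042", "text": "Obligation OBL-1042: $80,000 for laptop refresh, FY2024."},
]


def retrieve(query: str, records: list[dict]) -> list[dict]:
    """Naive keyword retrieval; a real system would use indexed or vector search."""
    terms = query.lower().split()
    return [r for r in records if any(t in r["text"].lower() for t in terms)]


def build_grounded_prompt(question: str, records: list[dict]) -> str:
    """Constrain the model to the retrieved agency data to reduce hallucinations."""
    context = "\n".join(r["text"] for r in records)
    return (
        "Answer using only the agency records below. "
        "If the records do not contain the answer, say so.\n\n"
        f"Records:\n{context}\n\nQuestion: {question}"
    )


question = "What was obligated for laptop refresh in FY2024?"
prompt = build_grounded_prompt(question, retrieve("laptop refresh", AGENCY_RECORDS))
print(prompt)  # This grounded prompt is what gets sent to the agency's chosen model.
```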
Intentional use of algorithms to yield expected outcomes
Before deploying any application, agencies need to show how the algorithms produce specific output. That proof point, however, could also become an avenue for improper disclosure.
Therefore, we urge agencies to be cautious about claims that combining multiple agencies’ data into a single training set is the way to go. To the contrary, agencies should train algorithms only with the data sufficient to achieve their required outcomes.
Agencies must build into their AI programs compliance with statutes and executive orders covering Health Insurance Portability and Accountability Act (HIPAA) information, any type of personally identifiable information (PII), and controlled unclassified information (CUI). They should add any agency-specific prohibitions on the information they are allowed to share or aggregate.
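As one hedged illustration of the kind of engineering control that supports this guidance, the sketch below screens records for obvious PII patterns before they enter a prompt or training set. The patterns and data are hypothetical, and a real program would layer this with policy, access controls, and human review.

```python
# Hypothetical illustration of screening records for obvious PII patterns
# before they are used in a prompt or training set. Patterns and data are
# illustrative only; real programs combine this with policy and access controls.
import re

PII_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),        # U.S. Social Security number format
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),  # email address
]


def contains_pii(text: str) -> bool:
    """Return True if the text matches any simple PII pattern."""
    return any(p.search(text) for p in PII_PATTERNS)


records = [
    "Vendor invoice 8812 approved for payment.",
    "Employee reimbursement for J. Smith, SSN 123-45-6789.",
]

cleared = [r for r in records if not contains_pii(r)]
print(cleared)  # Only the first record passes the screen.
```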
To be sure, thanks to our work at multiple agencies, CGI has developed big data sets. For instance, in our Momentum application, we handle data covering a significant percentage of federal spending. But we also pay strict attention to how users can easily query the data they need and retrieve only the appropriate data for the result they want—and are authorized to access.
We understand that each agency has its own PII and otherwise sensitive data. That is why we use controlled, measured approaches to each client’s application. Our aim is to enable agencies to create AI applications that make them more productive and efficient, without introducing undue cybersecurity or disclosure risk. We recognize that an agency’s data sets belong to the agency and are not to be shared where such sharing is prohibited.
AI has become more powerful and versatile than it once was, and federal agencies appear to be ready to experiment and try AI applications. Agencies that are careful, thoughtful and judicious in how they apply AI stand to reap the rewards.
Finding the right use cases
As much as any other technology, AI excels in some use cases and is not helpful at all in others. Agencies should analyze possible implementations to determine where AI would produce quick, easy, and significant results. Specific implementations and use cases allow for low-consequence experiments. CGI’s federal experts are well versed in helping clients define and implement use cases to understand benefits and optimize outcomes.
Some use cases that agencies can explore include:
- Predictive analytics: Many agencies respond to challenges reactively. With AI, they can get ahead of events; the system surfaces warning signs so staff can act proactively and minimize problems.
- Automated, streamlined contract closeouts and other processes: With contract closeouts, the system recognizes when a device reaches its end of life or when expenditures have hit the ceiling and automatically closes out the contract, rather than having an employee take those steps manually (see the sketch after this list).
- Trend analysis: AI systems collect, consolidate, correlate, and deliver new insights. Agencies identify how their internal organizations have been operating and can see how to make them better and more effective.
- Natural language processing (NLP): Incorporating natural language processing into customer interactions provides self-service, reduces processing time and lowers errors, all of which improve the citizen experience.
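To make the contract-closeout example concrete, here is a hedged sketch of the kind of rule such automation can apply: flag a contract when the associated device has passed its end-of-life date or when expenditures have reached the ceiling. The field names, data, and threshold logic are hypothetical, not Momentum data structures or CGI’s implementation.

```python
# Hypothetical sketch of the contract-closeout use case: flag contracts whose
# associated device has passed its end-of-life date or whose spending has
# reached the ceiling. Field names and data are illustrative only.
from datetime import date


def ready_for_closeout(contract: dict, today: date) -> bool:
    """Return True when a contract meets a simple closeout rule."""
    device_retired = contract["device_end_of_life"] < today
    ceiling_reached = contract["expended"] >= contract["ceiling"]
    return device_retired or ceiling_reached


contracts = [
    {"number": "47QTCA-0001", "device_end_of_life": date(2023, 9, 30),
     "expended": 410_000, "ceiling": 500_000},
    {"number": "47QTCA-0002", "device_end_of_life": date(2025, 3, 31),
     "expended": 500_000, "ceiling": 500_000},
]

for c in contracts:
    if ready_for_closeout(c, date(2024, 1, 15)):
        # A production workflow would route this into a closeout process instead of printing.
        print(f"Contract {c['number']} flagged for automated closeout.")
```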
A vision for the future
CGI is your partner in understanding, developing, and implementing generative AI within your agency to meet your goals. Our experts can provide guidance and insight into proven strategies, from technology to implementation. Whether you’re seeking to optimize Momentum or Sunflower or to create unique use cases that inform your agency’s vision, CGI is your proven partner.
For more information about CGI’s AI capabilities in the federal marketplace, visit our website.
For more information on Momentum and Sunflower, visit www.cgi.com/Momentum and www.cgi.com/Sunflower.