Atlas AI is built on new technology. Although GPT models have been available for some years, both parties are nonetheless at the cutting edge of implementing Generative AI.
To run any project diligently, specific processes should be created and followed based on best practice and experience. AI is no different.
ClearPeople has been working in this sector for over two decades, but the new AI technologies present new opportunities for us and our clients; both parties are tied to their success. We have therefore drawn on our collective knowledge and experience to establish a process for implementing our AI tools in Atlas, most of which focuses on discovery and education. Much of this process is driven through collaborative workshops, a key output being documentation that baselines the Scenarios (use cases) you expect to take value from when setting up and interacting with AI. We also need to cover training, governance, and any unique policies or protocols your organization may have in place relating to AI or content.
We deliver the implementation of Atlas AI as a Pilot, to help ensure we plan for success and validate accordingly before rolling the Atlas AI toolset out to end-users.
This article will discuss the implementation process of Atlas AI in 3 sections:
- Implementation Process, High-level plan & key workshops
- Discovery process in more detail (business workstream)
- Technical Deployment process in more detail (technical workstream)
Implementation Process & High-level Plan
The implementation process for the Atlas AI toolset is split into three iterative phases: Discovery, Experience & Confirm. These run alongside the technical workstream to deploy Atlas AI and Microsoft Azure OpenAI.
High-level Plan
Please note this is a high-level indicative plan. Timeframes may need to be extended depending on how prepared you are for the technical and business workstreams.
Week 1
- Kick-off (1 hour)
- Planning Session (1 hour) - to prepare both for the project and for the wider use of AI
- Optional Atlas AI demo & Art of the possible (1 hour)
- Start technical deployment of AI tooling and infrastructure (ongoing)
Week 2
- Discovery Session (90 minutes)
- Continue technical deployment (ongoing)
Week 3
- Playback session to verify and validate scenarios (60 minutes) - signals the end of the preliminary Discovery phase
- Validate technical deployment (ongoing)
- Atlas AI Admin training 1 - introduction, set-up, permissions & governance - signals the start of the Experience phase
Week 4
- AI Set-up & configuration against confirmed scenarios
- Configuration and set-up of any necessary Atlas workspaces and AI-ready content
- Atlas AI Admin training 2 - How to use the Atlas AI Assistant and Intelligent Knowledge Studio (IKS)
Week 5 onwards
- Ongoing touchpoints to set up needed scenarios, training, Q&A, assessments and validation
- Feedback and alterations
- Approval of existing scenarios that have been solutionised and are working in Atlas AI
- User Training
- User testing
- Feedback & amends
- Internal governance plan, including technical and business ownership and administration
- Internal communication plan
- Roll-out to end-users
Pre-requisites:
- Deployment of AI and basic set-up with permissions so that your stakeholders have the necessary level of access
- Scenarios need to be discussed, agreed, and baselined. This will provide direction for what we set up and how
- Content availability. Content will need to be available inside Atlas workspaces for the AI to index
- Understanding AI GPT models - this is not Atlas technology, so the settings and configuration are generic. Knowledge of these models is beneficial for the project and will be essential for technical ownership and administration
- Ownership, both from technical and business side
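Because the model settings mentioned in the prerequisites are generic GPT parameters rather than Atlas-specific ones, it helps to know the usual knobs in advance. The sketch below is a hypothetical illustration: temperature, top_p and max_tokens are standard OpenAI-style chat-completion settings, but the default values shown are illustrative only, not Atlas defaults.

```python
# Hypothetical illustration of generic GPT settings. The parameter names
# are standard OpenAI-style chat-completion settings; the values are
# illustrative defaults, not Atlas configuration.

DEFAULT_SETTINGS = {
    "temperature": 0.2,   # lower = more deterministic, factual answers
    "top_p": 0.95,        # nucleus-sampling cut-off
    "max_tokens": 800,    # upper bound on response length
}

def validate_settings(settings: dict) -> dict:
    """Basic sanity checks before applying settings to a model."""
    if not 0.0 <= settings["temperature"] <= 2.0:
        raise ValueError("temperature must be between 0 and 2")
    if not 0.0 < settings["top_p"] <= 1.0:
        raise ValueError("top_p must be between 0 and 1")
    if settings["max_tokens"] < 1:
        raise ValueError("max_tokens must be positive")
    return settings

print(validate_settings(DEFAULT_SETTINGS))
```

Whoever takes on technical ownership should understand what each of these settings does, since they behave the same in any GPT-based tool.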
Discovery Process in more detail
You will receive supplementary material for this discovery process, mainly PowerPoint slides and Excel spreadsheets, which provides more detail. However, it is useful to spell out the process below at a high level so you can plan accordingly.
Planning Session - a discussion on what is needed and when, and who is responsible for what. Usually run with the Project Manager, the Atlas and/or AI owners, and the technical owner.
Two discussions:
- AI planning. Overview of discovery and necessary documentation on Scenarios, presenting high-level questions such as: What kind of content are you looking to bring into AI and how do you expect users to leverage this content? We aim to understand what your expectations are and then to plan accordingly. We will use this time to address any risks or concerns, and ensure we all leave feeling more confident in the AI implementation.
- Technical deployment. Discuss and baseline plan for deployment of Azure OpenAI and Atlas AI code and infrastructure.
Any other questions you would like to ask, and an overview of any requirements already known.
----------------------------------------------------------
Discovery Session - these sessions are designed to collaboratively explore potential AI use case scenarios across your groups. We’ll walk through key areas such as:
- Business scenario and user objective
- Data sources and content inputs
- Potential requirements for prompt structures and interactions
- Governance (ownership, permissions, and compliance)
- Administration and scalability
- Success criteria and expected outcomes
- Our aim is to translate these insights into well-defined, actionable AI Scenarios using our standard scenario documentation format, which we will leverage during the build to help inform configuration. This allows us to accelerate prototyping, reduce ambiguity, and support early alignment on measurable outcomes.
- Scenario Template & Examples
For each proposed use case scenario, we’d like your teams to think through and prepare the following:
As a... What user group or role is this scenario looking to address?
- This might not be that complicated if, for example, each bid scenario simply says 'As a member of the Bids team looking to develop new business'; however, this may flex depending on the scenario.
I want to... What is the end-user trying to achieve?
- Describe the user’s intent or task.
- Include any relevant job roles or personas.
- What problem are they solving or decision are they supporting?
- What makes this scenario important or high-impact?
- Example: I want to generate a customized case study summary for client proposals based on prior engagements
Using... What content or data will the AI use?
- What are the input materials and format? (e.g., PDFs, SharePoint docs, web pages, expertise information or Biographies)
- Are there structured data sources (e.g., databases, spreadsheets) involved?
- Taxonomy - are there any labels or groupings that can be leveraged? E.g. if there are Case Study documents, we can tag these with 'Case Study'; or if different Bid material is geared towards different practice areas, it can be tagged accordingly. Jurisdictions, office information, client sector, and engagement type are also common considerations.
- Example: Using a library of completed case studies stored in Word documents on SharePoint, with Case Studies tagged as Case Studies within Information Type.
Setup Considerations... What rules or constraints should the AI follow?
- This is where requirements are listed. It differs from 'I want to...' because users may not be aware of overarching policies or requirements.
- Should the AI adopt a specific tone or writing style?
- Are there compliance, ethical, or data access constraints?
- Is there a preferred angle or point of view, e.g. always answer from the firm's perspective? Should the AI ask follow-up questions if more information is needed, or suggest other topics or considerations?
- Example: The volume and scope of the overall content are large and diverse.
I'll Know This is Successful When... How do we measure success?
- What outcomes will indicate this scenario is working well?
- Are there specific KPIs or quality benchmarks?
- Consider including usability factors too, e.g. ease of copying, clarity of output.
- Example: Responses must include citations and be copy-paste ready for Word documents used in bids.
Outcome Considerations... Are there secondary outcomes or comparisons to consider?
- This may be more generic information; however, each scenario may have its own needs in terms of comparing the outcome to other alternatives or existing processes.
- Should the AI suggest alternative solutions or summarize trade-offs?
- Are there human-in-the-loop steps that need to be preserved?
- Example: AI-generated summaries still need to be approved by the respective user's manager before being sent to any clients or submitted to the Bid library.
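To make the template concrete, the worked examples above can be pulled together into one completed scenario. The snippet below is a hypothetical sketch in Python; ClearPeople's actual scenario documentation lives in PowerPoint and Excel, so this only illustrates the fields a completed scenario should cover.

```python
# Hypothetical structured capture of one completed scenario, built from
# the examples in the template above. Field names mirror the headings.
scenario = {
    "as_a": "member of the Bids team looking to develop new business",
    "i_want_to": ("generate a customized case study summary for client "
                  "proposals based on prior engagements"),
    "using": ("a library of completed case studies stored in Word documents "
              "on SharePoint, tagged as 'Case Study' within Information Type"),
    "setup_considerations": [
        "always answer from the firm's perspective",
        "content volume and scope are large and diverse",
    ],
    "success_criteria": [
        "responses include citations",
        "output is copy-paste ready for Word documents used in bids",
    ],
    "outcome_considerations": [
        "summaries approved by the user's manager before client use",
    ],
}

# Quick completeness check before submitting the scenario for review.
missing = [field for field, value in scenario.items() if not value]
print("Missing fields:", missing or "none")
```

A scenario with no missing fields gives the workshop sessions a much faster starting point.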
Prioritization for the Pilot
Each group is encouraged to submit as many scenarios as they see fit, but for the initial phase, we recommend selecting 2–3 high-value scenarios per group to pilot. The remaining ideas can be documented and prioritized for later stages.
We'll work with you to evaluate and score these based on impact, feasibility, and readiness to ensure we focus on the most promising opportunities first.
What We Need From You
Please share this guidance with the relevant teams in advance of our discovery calls. The more detailed and thoughtful the inputs, the better we can tailor the solution to your needs, and the faster we can start building. The more considerations, requirements, outcomes, and general information you bring, the more detailed the scenarios will be and the more efficient we can be in the session.
----------------------------------------------------------
Playback session - to verify and validate Scenarios
- Play back and review the documentation, clarify anything needed, and reach agreement on how we approach and plan each use case to help ensure a positive outcome within the agreed timescale.
- Discussion on success criteria and how to assess value
----------------------------------------------------------
Training Sessions - to introduce Atlas AI Assistant and the Atlas Intelligent Knowledge Studio to the business and technical users for this AI pilot workstream
- Atlas AI Admin training 1 - introduction, set-up, permissions & governance.
- Atlas AI Admin training 2 - How to use the Atlas AI Assistant and Intelligent Knowledge Studio (IKS)
- Atlas AI Admin training 3 - introduction and run through of configured Knowledge Collections and Scenarios in play
Technical Deployment Process in more detail
- This process can be run in parallel to the above Discovery (Business) process
- The technical deployment of Atlas AI is dependent on following our Atlas Runbooks. If you are an existing client you should be well aware of this process. If you are a new client and are yet to deploy Atlas or Atlas AI tooling, we will support you through this process.
The Atlas Runbooks have specific sections on AI. These must be followed and the tooling deployed and set up accordingly. The Runbook will ensure the Atlas code and infrastructure necessary for the AI is set up and available. Your designated Atlas lead (not necessarily the AI workstream lead) will provide you with the appropriate information in good time.
- Atlas AI also depends on the necessary Microsoft Azure infrastructure being deployed in your Microsoft 365 tenant, primarily the Azure OpenAI service. Please follow the articles below. These can be started at any time, as they run independently of the Atlas deployment and set-up.
- Atlas AI - Azure Step 1: Review access to Azure OpenAI in your Azure subscription and request access to Microsoft (Atlas 6.0+)
- Atlas AI - Azure Step 2: Compatible regions for the Azure OpenAI service (Atlas 6.0+)
- Atlas AI - Azure Step 3: Manual creation of Azure OpenAI in your subscription (Atlas 6.0+)
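Once the three Azure steps above are complete, a quick connectivity check can confirm the Azure OpenAI deployment is reachable. The sketch below is a hypothetical smoke test using only the Python standard library; the resource name, deployment name, and environment variable names are placeholders to substitute with your own values. The REST path shown is the standard Azure OpenAI chat-completions route.

```python
import json
import os
import urllib.request

# Placeholders: substitute your own resource endpoint and deployment name,
# or export them as environment variables before running.
endpoint = os.environ.get("AZURE_OPENAI_ENDPOINT",
                          "https://<resource>.openai.azure.com")
deployment = os.environ.get("AZURE_OPENAI_DEPLOYMENT", "<deployment-name>")
api_key = os.environ.get("AZURE_OPENAI_API_KEY", "")

# Standard Azure OpenAI chat-completions route for a named deployment.
url = (f"{endpoint}/openai/deployments/{deployment}"
       f"/chat/completions?api-version=2024-02-01")
body = json.dumps({
    "messages": [{"role": "user", "content": "ping"}],
    "max_tokens": 16,
}).encode()

if api_key:  # only call the service when credentials are configured
    req = urllib.request.Request(
        url, data=body,
        headers={"api-key": api_key, "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
else:
    print("Set the AZURE_OPENAI_* environment variables to run the smoke test.")
```

If the call returns a completion, the Azure side of the deployment is ready for the Atlas Runbook steps to connect to.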