Ten Plays of Our Data & Analytics Approach
By the Data & Analytics Center of Excellence
- Key Concepts
- Play 1: Define a Vision
- Play 2: Obtain Leadership/Stakeholder Commitment
- Play 3: Evaluate the Current State
- Play 4: Develop Future State Requirements
- Play 5: Conduct a Gap Analysis
- Play 6: Create an Implementation Roadmap
- Play 7: Establish a Data Governance Structure
- Play 8: Create An Enterprise Data Model
- Play 9: Emphasize Rapid Prototyping
- Play 10: Embrace Continuous Process Improvement
- Related Resources
Building mature data analytics organizations across government is a necessary step to achieve the Cross-Agency Priority (CAP) goal. Leveraging data as a strategic asset will increase the federal government's effectiveness, facilitate oversight, and promote transparency.
The Data & Analytics Center of Excellence (CoE) drives innovation in data management, analysis, and reporting for partner agencies. Its overarching goal is to enable data-driven decision making.
The team developed this 10-play Playbook to help agencies deliver more effective government policies and programs. Not every play will apply to your agency, but together they will help you move toward more data-driven decisions.
The Data & Analytics CoE developed this Playbook according to relevant laws, policies, and guidance, including the newly released Federal Data Strategy and OMB Circular A-11. The relevant directives provide the fundamental framework by which the CoE conducts discovery and implementation.
Throughout this document, we will refer to the following key concepts:
- Advanced Analytics - Using tools and technologies to predict future trends (e.g., predictive analytics, data mining, and artificial intelligence (AI) efforts).
- Data Culture - Organizational investment in data and analytics capacity and the cultivation of an environment where all staff are encouraged to use data to make key decisions.
- Data Governance - The practice of data management to ensure high data quality; key focus areas include data availability, usability, integrity, and security.
- Data Infrastructure - Collection of technologies (e.g., hardware, software, cloud services) used for data sharing and utilization.
- Data Visualization - Using visual elements (e.g., charts or graphs) to quickly consume key information.
- Technical Infrastructure - Collection of hardware, software, networks, cloud resources, data centers, etc., used to support information technology (IT) services.
Play 1: Define a Vision

Define a clear, realistic vision of your organization's near-term and long-term data analytics goals. You'll then be able to assess your organization's data analytics maturity level and develop a strategy to enhance it.
- At a high level, document the types of concrete outcomes that would contribute to mission success.
- Develop and continue to refine the overarching vision statement and conceptual framework to explain what data analytics could mean to the organization, in both the near and long term.
- Who are your main internal and external customers?
- What are their primary business goals?
- What data limitations keep your organization from achieving success?
- What do you hope to achieve in the future?
- What sort of data culture exists in your organization?
- How does data culture align with your vision?
- What does success look like?
Play 2: Obtain Leadership/Stakeholder Commitment

Obtaining stakeholder buy-in early ensures that participants understand the desired outcomes and the requirements to achieve them. This buy-in requires a more open and transparent approach to exposing data, and a decision-making structure that emphasizes both the important role of data and continued improvement of the tools that support it.
- Develop an introductory document with high-level vision, project scope, and timeline.
- Align project scope with the organization’s vision.
- Document high-level outcomes without listing overly prescriptive deliverables or achievements.
- Involve leadership from all key areas: IT, business lines, communications, web, customer service, human resources, and finance.
- Set realistic expectations for progress.
- Ensure sustained commitment from leadership.
- Provide regular updates on progress, challenges, and next steps.
- Who are the key organizational stakeholders?
- Did you include at least one executive or decision-maker from every core business area?
- Does each team understand its roles/responsibilities?
- Do the stakeholders share a cohesive vision?
- Is there a clear and effective management structure to coordinate and enable data-related or data-dependent projects across the organization?
- Does the team agree about the outcomes and their feasibility?
- Have you identified data champions within the organization who can engage across teams?
Play 3: Evaluate the Current State

A current state analysis reviews the business functions, activities, and technology implementations the organization has already carried out or plans to carry out. Assess the current technical infrastructure and stakeholder roles in existing business processes to identify areas of success, areas for improvement, and challenges.
- Conduct interviews and workshops with leadership and program-level staff to identify the key workflows, stakeholders, technology, and systems.
- Develop workflows outlining the current systems, process flows, platforms, and application portfolio.
- Capture how current system owners interact with existing business processes or data systems as well as current data and analytics-related policies and standards.
- Document stakeholder-identified success areas and pain points or challenges that impede their work.
- Document similarities and common systems across business functions.
- Assess the skills that employees acquire through on-the-job training that add value to the organization (i.e., human capital assessment).
- Who are your organizational stakeholders?
- Who are your customers and end users?
- Who are the most competent subject-matter experts (SMEs) in your organization utilizing data assets or analytics applications?
- What are your data assets (e.g., databases)?
- What is your technical infrastructure?
- What are the processes currently used to manage data flows?
- Who owns them?
- Is there existing documentation?
- What are the current training processes?
- Is the impact of training being monitored?
- Are there planned technology acquisitions or staff hires?
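The inventory these steps produce can be kept as structured records rather than prose. A minimal sketch in Python, with purely hypothetical asset, owner, and system names:

```python
from dataclasses import dataclass, field

@dataclass
class DataAsset:
    # Illustrative fields only; adapt to your agency's inventory standards.
    name: str
    owner: str                # stakeholder or system owner from interviews
    system: str               # platform or application hosting the asset
    pain_points: list = field(default_factory=list)
    documented: bool = False  # does existing documentation cover this asset?

inventory = [
    DataAsset("grants_db", "Program Office", "Oracle", ["manual exports"], True),
    DataAsset("case_tracker", "Field Ops", "SharePoint", ["no data dictionary"]),
]

# Undocumented assets become action items in the current state report.
undocumented = [a.name for a in inventory if not a.documented]
```

Keeping the inventory machine-readable makes later plays, such as the gap analysis, easier to automate.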
Play 4: Develop Future State Requirements

The future state requirements define what the organization's data and analytics structure would ideally look like, based on leadership's vision and strategic goals.
Do not be constrained by the existing state when considering an ideal, target state.
- Develop a list of desired future state requirements that map to the vision: systems capabilities, Key Performance Indicators (KPIs), process flows, platforms, applications, analytics techniques (e.g., data visualization, artificial intelligence (AI), operations research, etc.), and personnel capabilities.
- Consider what types of roles would have access to which types of data at various points within processes.
- Define a plan to track the impact of training on data and analytics.
- Which business units have high citizen engagement?
- What data process improvements do you need to improve organizational efficiency?
- Would more datasets or reporting support improved decision making?
- What datasets or reporting do you need?
- What business units are candidates for modernization through advanced analytics, such as predictive analytics?
- What skills does your team need to adequately support technical infrastructure?
- Can you use existing platforms to support current business functions?
Play 5: Conduct a Gap Analysis

The gap analysis identifies the people, processes, and technologies required to move from the current state to the target state.
- Identify the gaps between the current and future state data architecture in terms of people, processes, and technology.
- Identify the processes and applications that are missing or hindering the organization from succeeding.
- Identify the gaps in information flow within the organization, including stakeholders or system owners’ lack of knowledge.
- Identify gaps in skills among existing roles.
- Align business requirements to the organization’s strategic vision.
- Are there data access challenges that need to be addressed?
- Do the gaps identified provide clear business value?
- Does closing the gaps identified drive towards the ideal state?
- What business units have burdensome manual reporting processes?
- Is the organization addressing identified gaps?
- What are the staffing requirements for a given future-state analytics application?
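One lightweight way to organize these findings is to tag each gap with its people/process/technology dimension and a business-value note. A sketch with hypothetical gaps:

```python
# Each gap maps a current state shortfall to its business value if closed.
# All gaps and values here are illustrative, not prescriptive.
gaps = [
    {"dimension": "people", "gap": "No staff trained in dashboarding tools",
     "business_value": "Self-service reporting for program offices"},
    {"dimension": "process", "gap": "Monthly reports compiled by hand",
     "business_value": "Staff hours redirected to analysis"},
    {"dimension": "technology", "gap": "No shared data warehouse",
     "business_value": "Single source of truth across business units"},
]

# Group gaps by dimension for the roadmap discussion.
by_dimension = {}
for g in gaps:
    by_dimension.setdefault(g["dimension"], []).append(g["gap"])
```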
Play 6: Create an Implementation Roadmap

Create an implementation roadmap that details the concrete milestones required to reach the target state. Use the priority areas identified in the gap analysis to select important near-term milestones, and evaluate the overall roadmap to ensure activities are in the right order.
- Identify a single role or individual to own the roadmap (e.g., CIO, data officer).
- Ensure that milestones in the areas of people, processes, and technologies are all considered and in the right order.
- Associate notional timeframes, resources, and requirements with milestones and associated activities.
- At least annually, review the roadmap to confirm or update priorities.
- Have key stakeholders within the organization reviewed the roadmap and provided feedback?
- Do the initial roadmap milestones address opportunities for showing the immediate value analytics brings in the highest-priority areas?
- How do activities in the roadmap align with ongoing activities or already planned acquisitions or hires?
- What are the key risks that could prevent the agency’s full execution of the roadmap?
- Are steps within your roadmap overly aggressive?
- Does your roadmap achieve maturity through measured, iterative improvement?
- What are your success factors?
- How will you measure them during implementation?
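The ordering check this play calls for can be made mechanical: list each milestone with its prerequisites and verify that no milestone appears before them. The milestone names below are hypothetical:

```python
# Roadmap milestones in proposed order; each lists its prerequisites.
roadmap = [
    ("stand_up_governance_group", []),
    ("pilot_data_inventory", ["stand_up_governance_group"]),
    ("deploy_shared_warehouse", ["pilot_data_inventory"]),
]

def prerequisites_ordered(roadmap):
    """Return True if every milestone appears after all of its prerequisites."""
    seen = set()
    for milestone, prereqs in roadmap:
        if not all(p in seen for p in prereqs):
            return False
        seen.add(milestone)
    return True
```

Running the check during the annual roadmap review catches sequencing errors before they become schedule slips.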
Play 7: Establish a Data Governance Structure

Establish a data governance group to standardize data processes across the organization, ensure that data roles are clearly defined, and encourage adoption of new technology and processes. Clearly assigning data-related responsibilities, including oversight, ownership, and championing, is a key aim of the data governance structure.
- Establish an organization-wide data governance plan.
- Identify an executive-level steering committee, focus workgroups, and data stewards/system owners to champion data and analytics efforts at an organizational level.
- Document and ensure the agency is tracking data and analytics responsibilities, reporting requirements, and roles required by statute.
- Develop and implement a structure for lifecycle data management.
- Develop documentation on standard operating procedures, roles, and responsibilities at every operational level.
- Ensure a robust relationship with the Chief Information Security Officer (CISO).
- Ensure the agency follows a comprehensive plan for meeting data privacy guidelines.
- Support implementation and monitor progress on data management tools and architecture, based on the implementation roadmap.
- Develop communities of practice to bring data experts together to share ideas and best practices.
- Develop a training plan to increase stakeholder adoption.
- Develop a continuous data asset evaluation process.
- Develop a strategy and plan for the deployment of an Open Data program.
- Develop a communication plan at every operational level to inform stakeholders of data lifecycle activities.
- Does your agency have existing data governance practices?
- Does your agency have an existing data governance committee or council?
- Is your overarching vision for your agency’s approach to data governance well defined?
- Have you identified data champions or stewards for all key business functions?
- Are data standards clear, straightforward, and well communicated throughout the organization?
- Do you have Open Data champions in your agency?
- What are your procedures for ensuring steps are taken to protect Personally Identifiable Information (PII) in managed data sets?
- Is data quality quantified and reported to the data governance committee and leadership?
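To answer that last question, data quality can be quantified starting with a simple per-field completeness score reported to the governance committee. A minimal sketch with hypothetical field names and records:

```python
def completeness(records, fields):
    """Share of records with a non-empty value for each field (0.0 to 1.0)."""
    total = len(records)
    return {
        f: sum(1 for r in records if r.get(f) not in (None, "")) / total
        for f in fields
    }

# Illustrative records only; real inputs would come from a managed data set.
records = [
    {"id": 1, "agency": "GSA", "email": "a@example.gov"},
    {"id": 2, "agency": "", "email": "b@example.gov"},
    {"id": 3, "agency": "DOI", "email": None},
]
scores = completeness(records, ["agency", "email"])
```

Tracking the same scores over time turns data quality into a trend the committee can act on, rather than a one-off audit.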
Play 8: Create An Enterprise Data Model

An Enterprise Data Model (EDM) is an integrated view of the data produced and consumed across an entire organization. It represents a single, integrated definition of data, unbiased by any system or application, and is independent of how the data is physically sourced, stored, processed, or accessed.
The model consists of enterprise-wide subject areas, fundamental entities and their relationships, and unified terms and definitions. EDM unites, formalizes, and represents the areas important to an organization, and determines the structure by which data is governed.
- Identify and understand critical business data sets, their sources, and relationships.
- Establish who’s responsible for data storage and access by engaging both business and technical data stewards.
- Ensure data stewards and data architects from across the organization can provide support.
- Establish the priorities and analysis needed for subsequent EDM development.
- Conduct working sessions to identify, develop, and verify an initial set of data concepts and terms (i.e., develop Entity Relationship Diagrams (ERD), XML Schemas (XSD), and/or an enterprise-wide Data Dictionary).
- Conduct design review sessions to verify consistent adherence to enterprise standards.
- Ensure privacy and security procedures are robust and managed with the Chief Information Security Officer (or other entity).
- Does the EDM include a central metadata dictionary, or repository, that provides consistent, standard definitions for information?
- Is the foundation shareable, consistent, and reusable for the transaction processing systems, data warehouse, and XML data exchange?
- Will the model improve data custodianship by ensuring a consistent view of the information across the organization?
- Does the EDM represent a single integrated definition of departmental data structure—unbiased of any system or application?
- Will the model reduce cycle time, cost, data redundancy, and effort, while increasing consistency, precision, and customer satisfaction?
- How does the EDM govern inclusion and management of PII?
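The data dictionary those working sessions produce can itself be a simple, system-independent structure. A sketch with hypothetical entities, definitions, and attributes:

```python
# A tiny enterprise data dictionary: one standard definition per entity,
# independent of any particular system or application.
data_dictionary = {
    "Grant": {
        "definition": "An award of financial assistance to a recipient",
        "attributes": ["grant_id", "recipient_id", "award_amount"],
    },
    "Recipient": {
        "definition": "An entity receiving federal financial assistance",
        "attributes": ["recipient_id", "legal_name"],
    },
}

def shared_keys(dictionary):
    """Attributes appearing in more than one entity hint at relationships."""
    counts = {}
    for entry in dictionary.values():
        for attr in entry["attributes"]:
            counts[attr] = counts.get(attr, 0) + 1
    return {a for a, c in counts.items() if c > 1}
```

Attributes shared across entities, like `recipient_id` here, are candidates for the relationships an ERD would formalize.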
Play 9: Emphasize Rapid Prototyping

Using an agile approach to prototyping analytics solutions helps build consensus around technical approaches and validates customer or user demand. Our goal is to establish a self-service model for analytics and identify specific, incremental enhancements to mature processes and technologies.
- Use cross-functional teams of developers, testers, and business owners to prototype.
- Use minimum viable products (MVPs) as the standard for prototyping: functioning tools or models that provide real value on a limited scale, or that support a portion of a process.
- Prototype streamlined processes and MVPs independently first, then together.
- Use actual business data and real business processes, informed by insights from data owners.
- Prototype capabilities for actual end users and collect feedback throughout the development process.
- Adopt an agile process; work in development sprints of 1 to 2 weeks.
- Identify low-hanging fruit: which previously identified onerous processes or high-value capabilities could serve as opportunities to prototype?
- Are the right tools available to prototype technical solutions quickly, such as drag-and-drop data manipulation and dashboarding tools?
- Do all project stakeholders discuss progress, roadblocks, and opportunities every day?
- Is leadership plugged into the process and able to provide feedback on prototypes before the development process ends?
- Who can grant access to required data resources?
- What equipment or credentials are required to gain access to necessary network or data resources?
Play 10: Embrace Continuous Process Improvement

Continuous Process Improvement (CPI) is the ongoing enhancement of products, services, or processes, applied throughout the implementation phase. Changes can be either incremental or breakthrough. The goal is to increase the organization's effectiveness and efficiency through small steps toward a more mature state.
The keys to CPI are implementing feedback mechanisms that continually collect input from system and process stakeholders, coupled with a change management process that systematically implements and monitors the value of changes.
- Engage senior leadership and key staff to ensure they’re committed to CPI.
- Clearly define the project platform.
- Document standard processes.
- Understand that CPI relies on employees, not top management, to identify ways to improve.
- Ensure agile process implementation so you can iteratively incorporate feedback into products and processes.
- Are workflow diagrams available to assist with CPI development?
- What teams in the workflow will be affected?
- Have you explained that change to the teams?
- Can you measure and repeat the improvements gained through CPI?
- Will the organization as a whole reflect these improvements?
- Will you embed a robust training and feedback mechanism within the CPI?