Job Description
LABOR CATEGORY #6 DATA ANALYST / ENGINEER IN SUPPORT OF ACT ARCHITECTURAL COHERENCE
Location: Norfolk, VA (on-site)
Number of Candidates: One
Period of Performance:
Base Period: 05 February 2024 – 31 December 2024
Option Period 1: 1 January 2025 – 31 December 2025
Option Period 2: 1 January 2026 – 31 December 2026
Option Period 3: 1 January 2027 – 31 December 2027
HQ SACT requires support in establishing the Enterprise Architecture function and reporting on ongoing and future programmes and initiatives delivering the Digital Transformation Implementation Strategy. HQ SACT is responsible for capturing capability requirements and developing capability architectures in support of the different NATO modernization and transformation programmes.
The support will consist of consolidating architectural data across multiple sources (e.g. specialized architectural tools and repositories, requirements management tools, project and portfolio management tools) to create reports, presentations, and interactive, web-published dashboards and heat maps illustrating the architectural scope, connections, overlaps, gaps, and delivery progress of multiple programmes and projects.
Tasking
- Gather and analyze data related to enterprise architecture and capability architectures: this includes data on capabilities, processes, data flows, requirements, and the programme portfolio.
- Create architectural dashboards and heat maps. Use data to visualize the architecture maturity, architectural links and gaps between initiatives, requirements fulfilment, status of programmes and projects, and identify potential risks and bottlenecks.
- Use data to create architectural roadmaps for capability development, identifying the capabilities needed to support future initiatives and NATO Digital Transformation.
- Collaborate with enterprise architects and capability architects to develop and implement data-driven architecture strategies. Work closely with the architects to understand their needs and deliver actionable insights.
- Integrate data sources relevant for the enterprise architecture (capability architectures, taxonomies, requirements, and programme/project information).
- Develop, construct, test and maintain data pipelines and data processing tools such as databases, data warehouses and ETLs.
- Transform data into formats that can be easily analyzed by developing, maintaining, and testing infrastructures for data generation.
- Configure, integrate and maintain data integration and analytics tools and shared repositories (e.g. PowerBI, KNIME, SQL Server and other DBMS’, SharePoint).
- Improve data quality and efficiency.
- Support evaluation of capability requirements and objectives and their delivery status.
- Perform additional tasks as required by the COTR related to the labor category.
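As a minimal illustration of the consolidation and pipeline work described above, the sketch below joins a hypothetical JSON requirements export with a hypothetical XML project-status export on a shared capability field and stages the result in SQLite, where a dashboard tool could query it. All file contents, field names, and the join key are illustrative assumptions, not actual HQ SACT data sources.

```python
# Hedged sketch of one consolidation step. The source formats and
# field names (req_id, capability, progress, ...) are hypothetical.
import json
import sqlite3
import xml.etree.ElementTree as ET

requirements_json = '''
[{"req_id": "R-001", "capability": "C2", "status": "approved"},
 {"req_id": "R-002", "capability": "Cyber", "status": "draft"}]
'''

projects_xml = '''
<projects>
  <project id="P-10" capability="C2" progress="60"/>
  <project id="P-11" capability="Cyber" progress="25"/>
</projects>
'''

def consolidate(req_text, proj_text):
    """Join requirements and project progress on the capability field."""
    reqs = json.loads(req_text)
    projects = {
        p.get("capability"): p
        for p in ET.fromstring(proj_text).iter("project")
    }
    rows = []
    for r in reqs:
        p = projects.get(r["capability"], {})
        rows.append((r["req_id"], r["capability"], r["status"],
                     p.get("id"), int(p.get("progress", 0))))
    return rows

# Stage the joined view in an in-memory SQLite table that a
# reporting/dashboard tool could read from.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE capability_view
                (req_id TEXT, capability TEXT, req_status TEXT,
                 project_id TEXT, progress INTEGER)""")
conn.executemany("INSERT INTO capability_view VALUES (?, ?, ?, ?, ?)",
                 consolidate(requirements_json, projects_xml))
print(conn.execute("SELECT capability, progress FROM capability_view").fetchall())
# → [('C2', 60), ('Cyber', 25)]
```

In practice the same pattern scales out: each source system gets a small extract-and-normalize step, and the staged tables feed the dashboards and heat maps described in the tasking.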
Essential Qualifications
This role is a hybrid of data analysis and data engineering, and will require you to have skills in both areas.
- A university degree in a relevant engineering, management, information systems, accounting, economics, finance, business administration, public administration, operations research, project management, or related discipline. A minimum of five years' professional experience in data analytics/engineering will be accepted in lieu of a degree in a relevant field.
- Demonstrable recent experience (at least 5 years in the last 10) in complex data analysis and processing. List most relevant projects supported, analytical artefacts produced, or data engineering tasks performed.
- Demonstrable recent hands-on experience (at least 3 years in the last 10) in using modern software architecture and software development related to data science, analytics, and data integration, e.g. Python, SQL, KNIME, Pentaho, or similar, and familiarity with semi-structured data (XML, JSON) and APIs. List the most relevant projects supported (no more than 3), tools/languages used, and the nature of the source data (e.g. relational, free text, JSON/XML).
- Demonstrable recent hands-on experience (at least 3 years in the last 10) in designing and implementing data warehouse and data lake solutions. Knowledge of different data modelling paradigms, e.g. relational, dimensional, triple store (semantic wikis), NoSQL. List the most relevant projects supported (no more than 3), tools used, and the data modelling paradigm used.
- Effective storytelling with data; experience as a 'data storyteller'. Provide the most relevant examples (no more than 3) together with the type of audience, the 'data story' presented, and the presentation goals and outcomes.
- Demonstrable recent hands-on experience (at least 2 years in the last 5) with data presentation and visualization tools, e.g. Microsoft Power BI, Tableau, Kibana, or similar, producing high-quality graphs, reports, charts, heat maps, and interactive dashboards. List the most relevant projects supported (no more than 3), types of visualizations, and tools used.