Are you passionate about using data to accelerate product development, manage business growth, and improve the Chase customer experience? Our Lead Data Architect role may be an exciting opportunity for you!
As a Data Domain Architect, you will provide strategic thought leadership on how to develop and deliver data for your business area to support business operations, strategic objectives, and advanced analytics. You will define data (metadata), identify systems of record and authoritative sources, create data quality rules, create data flow diagrams, and apply firm-wide standards. You will create complex SQL queries to support data analysis and Python-based AI/ML solutions when needed.
The Commercial Real Estate (CRE) group is the nation's leading multifamily lender, leveraging its industry knowledge to offer best-in-class and cost-effective financing solutions across all major real estate asset classes with speed, ease, and certainty of execution.
Job Responsibilities
- Define and execute a strategy for the development and delivery of product data to support strategic business objectives, business operations, advanced analytics, and metrics and reporting.
- Build a strong understanding of the data and its use within the business and across lines of business and functions, through collaborative partnerships with multiple stakeholders, including Product Owners, analytics leads, and business process owners. Provide subject matter expertise with respect to the content and use of data in the product and associated business area.
- Identify and prioritize the scope of critical data within the product, ensuring that the prioritized data is well-documented as to its meaning and purpose, and classified accordingly with metadata to enable its understanding and control.
- Establish expectations for the required accuracy, completeness, and timeliness of data within the product, and coordinate and influence internal and partnered resources to deliver data quality requirements.
- Create functional and technical specifications, epics and user stories, process flows, data analyses, mapping documents, implementation plans, and other Agile artifacts.
- Work in an Agile environment to build data requirements documentation and to design and implement data models with the right tools, controls, quality, operational processes, and procedures to ensure the consistency and quality of CRE data.
- Develop and implement Domain-Driven Design (DDD)-based data modeling, data engineering, and data resiliency design standards for AWS data platforms, covering internal and third-party data sources.
Required Qualifications, Capabilities, and Skills
- Data domain expertise and the drive to know everything about the data on the platform.
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related discipline, and 6+ years of relevant experience in technical project and data solutions management.
- Hands-on experience delivering system design, application development, testing, and operational stability, with strong communication skills and the ability to distill information and results into simple concepts and actionable insights for a variety of audiences.
- Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
- Ability to evaluate current and emerging technologies to recommend the best data architecture solutions for the future state architecture
- Hands-on experience in an AWS Cloud environment; a strong passion for business intelligence and analytics, with the ability to drive capability development using advanced data analytics techniques and statistical methodologies.
- Demonstrated success developing and delivering analytics solutions in a dynamic cross-functional environment, along with controlled delivery of specific data content through multiple channels, including dashboards, APIs, and Kafka.
Preferred Qualifications, Capabilities, and Skills
- Comfortable with streaming and big data concepts and technologies: Oracle, Java, Python, Spark, Flink, Kafka, HDFS, Airflow, Elasticsearch, Snowflake, Cassandra, and AWS Cloud Services (MSK, Aurora, DynamoDB, Redshift, etc.)
- Excellent knowledge of Domain-Driven Design principles for creating conceptual and logical models that describe a particular domain of data, using these models to inform the physical design of data-related projects and consulting enterprise and industry models where applicable.
- Advanced knowledge of architecture and one or more programming languages
- Experience enforcing standard data modeling discipline, processes, and metrics to support the integrity of key data, and providing management oversight to drive data quality programs.