Hello!

I’m Valentine Nde

Data Engineer

Minneapolis, MN

About me


Data engineer with a passion for turning data into meaningful information.


Experience

Lead Data Engineer Consultant

VISIONTREE

December 2020 – Present

  • Lead a team of five data engineers building and maintaining scalable, reliable data pipelines for ingesting, processing, and analyzing large volumes of structured and unstructured data from various sources.
  • Design and implement data models and schemas using SQL and NoSQL databases to support business intelligence and analytics needs.
  • Develop and optimize ETL processes using Python, Spark, and Kafka to transform and load data into the data warehouse and data lake.
  • Utilize AWS services such as S3, EC2, EMR, Lambda, Glue, Athena, Redshift, and Kinesis to create cloud-based data solutions.
  • Collaborate with data scientists and analysts to provide data access and support for machine learning and reporting projects.
  • Monitor and troubleshoot data quality issues and performance bottlenecks using tools such as Airflow, Grafana, and Splunk.
  • Partner with Business Intelligence and Data Engineering leads to establish a strategic end-to-end solution.
  • Collaborate with the Enterprise Data teams to provide user acceptance testing, maintain data quality and advance the technical toolset.
  • Integrate and aggregate complex data from multiple data sources and platforms to enhance customer access to information and promote a data-driven culture.

LEAD APP DATA ENGINEER

WELLS FARGO & COMPANY

August 2016 – November 2020

  • Managed 3 cross-functional teams working in close collaboration with analytics, engineering, and stakeholders.
  • Automated infrastructure management with Terraform, reducing deployment time by 51% in the AWS environment.
  • Analyzed data using Looker to proactively identify anomalies, limiting discrepancies by 26%.
  • Developed CI/CD pipelines using GitHub, increasing development efficiency by 34% through automated testing.
  • Designed AWS Data Pipeline workflows for efficient extraction, transformation, and loading, reducing data processing time by 42%.
  • Directed day-to-day operations of data-dependent systems.
  • Ensured 100% of data was processed and transferred on time.
  • Processed, manipulated, stored, and parsed data in 6 data pipelines.
  • Designed, developed, deployed, and maintained data services for 20 pipelines.
  • Developed 15 pages of best-practice documentation covering approaches to support continuous process automation for data ingestion and data pipeline workflows.
  • Created and presented reports, analyses, and presentations to 9 stakeholders, including executives.
  • Performed source system analysis, data profiling, and design of mapping logic between source systems and 3 data warehouses.
  • Created data extraction, cleansing, and load programs to move data from source systems to 3 data warehouses.

SENIOR DATA ENGINEER CONSULTANT

MINNEAPOLIS, MN
DELOITTE CONSULTING LLP

January 2015 – November 2016

  • Worked with customers, end users, technical analysts, and application designers to define data requirements.
  • Designed and developed data requirements for 12 Business Intelligence (BI) applications.
  • Developed, maintained, and oversaw 30 automated and scalable ETL/ELT data pipelines.
  • Sourced, processed, validated, transformed, aggregated, and distributed data from 50 sources.
  • Built cloud-based data solutions housing data from 50 data sources, resulting in a 32% increase in revenue.
  • Optimized 12 non-performing databases, queries, and pipelines.
  • Ensured timely access to data by 100% of applications.
  • Documented data sources, data structures, data flows, and data infrastructure.
  • Developed monitoring and alerting capabilities to ensure 100% of data pipelines were working.
  • Developed high-performance data pipelines to support complex data integration.
  • Collaborated with a 12-person team to develop 20 automation programs, shell scripts, and other utilities.
  • Wrote and designed testable, efficient code according to industry and company best practices.
  • Supported process flow analyses and ETL process redesigns, contributing 3 implemented plans.
  • Documented customers’ requirements and provided cost/effort breakdowns.
  • Created a solution documentation framework to ensure 100% of errors were corrected and recorded.
  • Led the team in appropriate decision-making through sound judgment and the ability to analyze options and implications.
  • Investigated and developed approaches and solutions to address technical problems with project teams.

LEAD DATA ARCHITECT

BLOOMINGTON, MN
EMERSON PROCESS MANAGEMENT

January 2013 – January 2015

  • Implemented and maintained data architecture built around automated ingestion, data security, compliance, and governance.
  • Designed the infrastructure required for optimizing extraction and transformation of data using AWS to improve loads by 62%.
  • Liaised with 25 stakeholders, including the product, BI, and design teams, to assist with data-related technical challenges.
  • Mentored 9 data engineers and 10 other engineers and business leaders in all aspects of data management.
  • Drove the design, building, and launching of 4 new data models and data pipelines in production.
  • Responsible for 100% of all data quality across product verticals and related business areas.
  • Built 4 continuous data pipelines with fault-tolerant architecture.
  • Improved and maintained more than 40% of the data infrastructure in AWS.
  • Transferred data and events between ATG and data partners.
  • Designed and documented 30 table schemas.

DATA ENGINEER

EDEN PRAIRIE, MN
C.H. ROBINSON

December 2009 – December 2012

  • Built and maintained data pipelines using SQL and SSIS.
  • Extracted, transformed, and loaded data from various sources, such as CSV, XML, and JSON files and SQL Server databases, using Python scripts and SQL queries.
  • Created and maintained data warehouse tables and cubes using SQL Server Management Studio and SQL Server Data Tools.
  • Developed reports and dashboards using SSRS and Power BI to visualize and present data to end users.
  • Translated business propositions into quantitative queries and collected and cleaned the necessary data.
  • Evaluated workflows and increased the efficiency of data pipelines that processed over 50 TB of data daily.
  • Reviewed existing documentation on the ETL architecture and underlying database data model.
  • Analyzed and recommended SQL query optimizations and worked with DBAs to obtain optimized queries.
  • Devised a process map and plan for release management of defects and change requests.
  • Coordinated ETL development activities with offshore team members.
  • Participated in walkthroughs of requirements, specifications, database designs, ETL code, and test strategies.
  • Created dimension hierarchies, level-based measures, time series metrics, and derived columns of business metrics.
  • Generated reports and dashboards using report features such as pivot tables, charts, column selectors, and view selectors.
  • Led an initiative to improve quality throughout all phases of development.
  • Worked with business users to train them on new applications and dashboards and helped them prepare for UAT.

DATA WAREHOUSE CONSULTANT

SAINT PAUL, MN
DIGITILITI INC.

December 2006 – December 2009

  • Created comprehensive, easy-to-understand documentation of the complete implementation of the Customer Sales business modules (reports, dashboards, security, and star schema).
  • Developed the Oracle BIEE metadata repository (.rpd) model for the Sales and Marketing subject areas, building the Physical and BMM layers with aggregates, dimensions, hierarchies, and time series functions, and the Presentation layer with catalogs and folders, using the Oracle BI Administration Tool.
  • Developed the incremental ETL process (mappings and mapplets) using various transformations and workflows with sessions and tasks.
  • Implemented security by creating users and web groups, setting up external table authentication, creating session variables and initialization blocks, and associating catalog objects and web modules with the created groups.
  • Analyzed and validated the business model in the Administration Tool by performing consistency checks, validating logical source tables and logical columns, and validating repository-level calculations in the Business Model & Mapping layer.
  • Developed interactive dashboards using reports with different views, implementing pagination (drill-down, guided navigation, pivot tables, charts, column selectors, and dashboard and page prompts) with Oracle Presentation Services.
  • Worked on repository variables and session variables.
  • Used security settings in the Console and Enterprise Manager to set up users, groups, and access privileges.

I’ve gotten to work with some amazing brands!

Education

University of Saint Thomas

MASTER OF SCIENCE IN SOFTWARE ENGINEERING

SAINT PAUL, MN

University of Buea

BACHELOR OF SCIENCE

BUEA, CAMEROON

Skills

  • UX/UI Design
  • Product Strategy
  • Rapid Prototyping
  • Usability Testing
  • Information Architecture



Let’s Work Together