Booz Allen Hamilton Data Engineer, Lead in Fayetteville, North Carolina

Data Engineer, Lead in Fayetteville, NC at Booz Allen Hamilton Inc.

Date Posted: 5/17/2018


Job Snapshot

  • Employee Type: Full-Time

  • Location: Fayetteville, NC

  • Job Type: Engineering

  • Experience: Not Specified

  • Date Posted: 5/17/2018

About Us

At Booz Allen, we harness our collective ingenuity to solve our clients' toughest management and technology problems. We work with governments, Fortune 500 corporations, and not-for-profits around the globe, in industries ranging from defense to health, energy to international development. We believe there is no product, code, or strategy that can create progress; only people can. That's why for more than 100 years we've empowered our team: over 24,000 dreamers, drivers, and doers who work together to change the world.

Job Description

Job Number: R0028987

Data Engineer, Lead

Key Role:

Integrate, manipulate, and manage vast amounts of data to build the next generation of Big Data analytic solutions for our clients. Combine engineering expertise with innovation to deliver robust solutions that serve our clients and stand apart from our competitors. Interact with a multi-disciplinary team of analysts, data scientists, developers, and users to understand data requirements and develop a robust data processing pipeline that will ingest, manipulate, normalize, and expose potentially billions of records per day to support advanced analytics. Perform programming, display clean coding habits, pay strict attention to detail, and focus on quality. Collaborate with and contribute to open source software communities, ensuring quality delivery of software through thorough testing and reviews. Build and launch new data models that provide intuitive analytics to our customers, and design, build, and launch efficient and reliable data pipelines to move large and small amounts of data to our data platform. Design and develop new systems and tools that enable users to consume and comprehend data faster, and identify new technologies to inject into the platform to support advanced data integration and analysis.
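To illustrate the kind of work described above, the following minimal Python sketch shows one normalization stage of an ingest pipeline (Python is among the scripting languages listed under Additional Qualifications). The field names, timestamp handling, and error policy are assumptions made for this example, not details taken from the role:

# Minimal, illustrative sketch of one normalization stage in an ingest
# pipeline. The field names ("timestamp", "source", "payload") and the
# UTC ISO-8601 target format are assumptions for this example only.
import json
from datetime import datetime, timezone
from typing import Iterable, Iterator


def normalize(raw_lines: Iterable[str]) -> Iterator[dict]:
    """Parse raw JSON lines, coerce timestamps to UTC ISO-8601, and skip
    records that fail validation."""
    for line in raw_lines:
        try:
            record = json.loads(line)
            event_time = datetime.fromtimestamp(
                float(record["timestamp"]), tz=timezone.utc
            )
            yield {
                "event_time": event_time.isoformat(),
                "source": record.get("source", "unknown"),
                "payload": record["payload"],
            }
        except (ValueError, KeyError, TypeError):
            # A production pipeline would route failures to a dead-letter
            # queue rather than silently dropping them.
            continue


if __name__ == "__main__":
    sample = [
        '{"timestamp": 1526515200, "source": "sensor-a", "payload": {"value": 42}}',
        "not valid json",
    ]
    for rec in normalize(sample):
        print(rec)

At the scale described above, logic like this would run inside a batch or streaming framework such as the Storm, NiFi, Apex, or Flink options named in the qualifications, rather than as a standalone script.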

Basic Qualifications:

-7+ years of experience with dimensional data modeling and schema design in Data Warehouses

-7+ years of experience with software design, implementation, and test

-5+ years of experience with custom or structured ETL design, implementation, and maintenance

-2+ years of experience with leading technical delivery teams

-Experience working with MapReduce or similar systems at any size and scale

-Experience with batch and streaming frameworks, including Storm, NiFi, Apex, or Flink

-Experience with storage components, including Accumulo, HBase, or Hive

-Experience with search technologies, including Solr and Elasticsearch

-Experience with developing service APIs for external consumption

-Knowledge of RESTful services design, development, and testing

-Ability to display clean coding habits, pay strict attention to details, and focus on quality

-Ability to learn technical concepts quickly and communicate with multiple functional groups

-TS/SCI clearance

-BS degree

Additional Qualifications:

-Experience with multiple data modeling concepts, including XML or JSON

-Experience with RDBMS data stores, including Oracle or MySQL

-Experience with large-scale, distributed systems design and development

-Experience with machine learning and deep learning concepts and algorithms

-Experience with DevOps methods and tools, including Jenkins, Git, SVN, Docker, or Vagrant

-Knowledge of scaling, performance, and scheduling

-Knowledge of at least one scripting language, such as Python, Node, Ruby, or Bash

-Knowledge of system architecture, including process, memory, storage, and networking management preferred

-Possession of analytical and problem-solving skills

-BS degree in CS or a related field

-Security+ or CISSP Certification

Clearance:

Applicants selected will be subject to a security investigation and may need to meet eligibility requirements for access to classified information; TS/SCI clearance is required.

We’re an EOE that empowers our people—no matter their race, color, religion, sex, gender identity, sexual orientation, national origin, disability, or veteran status—to fearlessly drive change.