Missouri Manufacturing Jobs


Job Information

Wells Fargo Software Sr Engineer - Big Data Developer in Saint Louis, Missouri

Job Description

Important Note: During the application process, ensure your contact information (email and phone number) is up to date and upload your current resume when submitting your application for consideration. To participate in some selection activities you will need to respond to an invitation. The invitation can be sent by both email and text message. In order to receive text message invitations, your profile must include a mobile phone number designated as 'Personal Cell' or 'Cellular' in the contact information of your application.

At Wells Fargo, we want to satisfy our customers' financial needs and help them succeed financially. We're looking for talented people who will put our customers at the center of everything we do. Join our diverse and inclusive team where you'll feel valued and inspired to contribute your unique skills and experience.

Help us build a better Wells Fargo. It all begins with outstanding talent. It all begins with you.

Wells Fargo Technology is a team of more than 40,000 information technology and security professionals who help keep Wells Fargo at the forefront of America's diversified financial services companies. Employees execute an engineering-led IT strategy to deliver stable, secure, scalable and innovative services that provide Wells Fargo's global customers round-the-clock banking access through in-store, online, ATM, and other channels. Wells Fargo Technology plays a critical role in the company's customer and employee experience, business and risk management transformation, and growth agenda.

This is a senior technical position on the Big Data Messaging Service data application team. This team supports the ingestion, cleansing and parsing of Exchange and Skype data into the Enterprise Data Environment, landing information in the Enterprise Data Lake. The team also supports other corresponding enterprise initiatives.

The Enterprise Data Lake is a core enterprise capability that enables key initiatives driving the Company's data transformation, analytics agenda, digitization of products and services, and regulatory compliance. At the center of the Enterprise Data Lake is the big data platform, composed of a rich portfolio of open source products (Hadoop and related ecosystem tools), COTS (commercial off-the-shelf) products, and internally developed capabilities.

The Software Sr Engineer position will be primarily focused on engineering, development and support of the ingestion, cleansing and parsing of Exchange and Skype data into the Enterprise Data Lake. This position will be a subject matter expert on the process and underlying technology infrastructure. This candidate must have excellent interpersonal, organizational and collaboration skills, as technology teams primarily work virtually. As a Software Sr Engineer you will provide direction and guidance to less experienced staff. Finally, candidates should be comfortable collaborating with non-technical financial business professionals, troubleshooting issues, translating business requirements, and writing clear, concise documentation.

Key skills and responsibilities include:

  • Design and build high performing and scalable Real Time Streaming Applications using Kafka, Apache Spark, HBase and object storage architecture.

  • Design high performing data models on big-data architecture as data services.

  • Provide technology leadership for data services that deliver Strategic Enterprise Management data.

  • Partner with Enterprise data teams such as Data Management & Insights and Enterprise Data Environment (Data Lake) to identify the best place to source the data.

  • Work with business analysts, development teams and project managers for requirements and business rules.

  • Work with Infrastructure Engineers and System Administrators as appropriate in designing the big-data infrastructure.

  • Support ongoing release management efforts for Development, QA and Production environments.

  • Utilize a thorough understanding of available technology, tools, and existing designs.

  • Leverage knowledge of industry trends to build best in class technology to provide competitive advantage.

  • Act as an expert technical resource to programming staff in the program development, testing, and implementation process.

  • Ability to work independently with little day to day oversight.

  • Strong verbal, written, and interpersonal communication skills.
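As an illustrative sketch only (the posting does not describe Wells Fargo's actual implementation), the cleanse-and-parse step for Exchange-style message data might look like the following in Python, using only the standard library. All field names in the output record are hypothetical:

```python
from email import message_from_string
from email.utils import parsedate_to_datetime

def parse_message(raw: str) -> dict:
    """Parse a raw RFC 5322 (Exchange-style) message into a flat,
    cleansed record suitable for landing in a data lake.
    Hypothetical schema for illustration only."""
    msg = message_from_string(raw)
    sent = parsedate_to_datetime(msg["Date"]) if msg["Date"] else None
    return {
        # Normalize addresses: trim whitespace, lowercase for consistent joins
        "sender": (msg["From"] or "").strip().lower(),
        "recipients": [r.strip().lower()
                       for r in (msg["To"] or "").split(",") if r.strip()],
        "subject": (msg["Subject"] or "").strip(),
        # Store timestamps in ISO 8601 so downstream tools parse them uniformly
        "sent_utc": sent.isoformat() if sent else None,
        # Simple single-part bodies only in this sketch
        "body": msg.get_payload() if not msg.is_multipart() else None,
    }

raw = (
    "From: Alice.Example@example.com\r\n"
    "To: bob@example.com, carol@example.com\r\n"
    "Subject: Quarterly numbers\r\n"
    "Date: Thu, 13 May 2021 01:52:58 +0000\r\n"
    "\r\n"
    "Draft attached.\r\n"
)
record = parse_message(raw)
print(record["sender"])           # alice.example@example.com
print(len(record["recipients"]))  # 2
```

In a production pipeline of the kind the posting describes, records like this would typically be serialized (e.g., to Parquet) and streamed through Kafka into the lake rather than returned as Python dicts.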

Required Qualifications

  • 7+ years of software engineering experience

  • 10+ years of Progress programming experience

  • 5+ years of experience with Agile tools

  • 5+ years of Hadoop Hortonworks experience

  • 10+ years of experience delivering complex enterprise wide information technology solutions

  • 7+ years of Java or Python experience

  • 3+ years of Kafka Platform experience, Confluent Platform experience, or a combination of both

  • 3+ years of Apache Spark design and development experience using Scala, Java, Python or Data Frames with Resilient Distributed Datasets (RDDs)

  • 3+ years of design and development experience with columnar databases using Parquet or ORC file formats on Hadoop

  • 5+ years of operational risk, conduct risk or compliance domain experience

  • 5+ years of analytics experience

  • 2+ years of experience developing and using RESTful Application Programming Interfaces (APIs)

Desired Qualifications

  • An industry-standard technology certification

  • Strong verbal, written, and interpersonal communication skills

  • Experience in Hadoop ecosystem tools for real-time batch data ingestion, processing and provisioning such as Apache Flume, Apache Kafka, Apache Sqoop, Apache Flink, Apache Spark or Apache Storm

  • Knowledge and understanding of project management methodologies used in waterfall or Agile development projects

  • Knowledge and understanding of DevOps principles

  • 10+ years of experience with one or more of the following programming languages: Python, Perl, or KornShell (KSH)

  • Ability to execute in a fast-paced, high-demand environment while balancing multiple priorities

  • Ability to interact effectively and confidently with senior management

  • Knowledge and understanding of virtual environments

Other Desired Qualifications

  • 4+ years of experience in Hadoop administration, involving HBase, Kafka, MapR-DB, Spark

  • 1+ year of cloud experience, such as AWS EMR or Azure

  • 2+ years of cloud computing engineering experience in one or a combination of the following: Amazon Web Services (AWS), Google Cloud Platform (GCP) or Azure Cloud Security

  • Experience designing and developing data analytics solutions using object data stores such as S3, and related cloud experience

  • A BS/BA degree or higher in information technology, or equivalent relevant industry experience

Job Expectations

  • Ability to travel up to 10% of the time

Salary Information

The salary range displayed below is based on a full-time, 40-hour-per-week schedule.

MN-Minneapolis: Min: $108,500 Mid: $155,000

AZ-Chandler: Min: $108,500 Mid: $155,000

NC-Charlotte: Min: $108,500 Mid: $155,000

Disclaimer

All offers for employment with Wells Fargo are contingent upon the candidate having successfully completed a criminal background check. Wells Fargo will consider qualified candidates with criminal histories in a manner consistent with the requirements of applicable local, state and Federal law, including Section 19 of the Federal Deposit Insurance Act.

Relevant military experience is considered for veterans and transitioning service men and women.

Wells Fargo is an Affirmative Action and Equal Opportunity Employer, Minority/Female/Disabled/Veteran/Gender Identity/Sexual Orientation.

Benefits Summary

Benefits

Visit https://www.wellsfargo.com/about/careers/benefits for benefits information.

Company: Wells Fargo

Req Number: 5564259-6

Updated: 2021-05-13 01:52:58.177 UTC

Location: Saint Louis, MO
