
Sr Data Pipeline Engineer

Date:  Nov 18, 2022
Location:  Remote, MA, US

Onsite or Remote:  Remote
Company Name:  EBSCO Information Services

EBSCO Information Services (EIS) provides a complete and optimized research solution composed of e-journals, e-books, and research databases - all combined with the most powerful discovery service to support the information needs and maximize the research experience of our end-users. Headquartered in Ipswich, MA, EIS employs more than 2,700 people worldwide, most now working hybrid or remotely. We are the leader in our field due to our cutting-edge technology, forward-thinking philosophy, and outstanding team. EIS is a company that will motivate you, inspire you, and allow you to grow. Our mission is to transform lives by providing relevant and reliable information when, where, and how people need it. We are looking for bright and creative individuals whose unique differences will allow us to achieve this inclusive mission around the world.

 

 

Mission

The Always Be Coding (ABC) team comprises Library Service Engineers (LSEs): sales engineers, technology consultants, and software developers dedicated to improving internal EBSCO processes and systems, as well as the customer experience of EBSCO products.

 

EBSCO FOLIO Consulting Services is a sibling team and an internal customer of ABC. Its Implementation Consultants (ICs) support a vibrant, community-enabled ecosystem of innovation, openness, and choice with solutions that combine the open-source FOLIO platform with EBSCO Data Services and Professional Services.

 

We are looking for a highly motivated programmer to join the ABC team. This position, Senior Data Pipeline Engineer, will be responsible for designing and building a data pipeline that consolidates and manages the ICs' current stand-alone scripts for converting and loading data.

 

The primary tooling will be Apache Airflow, Jenkins, and/or other industry-standard pipeline and automation systems. Existing stand-alone jobs are primarily written in Python. Some adjacent services use modern JavaScript and/or TypeScript, specifically the Node.js runtime environment, the Jest testing framework, and the React library. Other tools and workflows include the Visual Studio Code IDE, the Git version control system using the GitHub Flow workflow, the Jenkins CI/CD pipeline with the SonarQube code analyzer, and individual Amazon Web Services (AWS) offerings, all under the Agile development model.

The successful candidate will be personable and generous in sharing knowledge with their colleagues, and creative and productive when developing new apps for our internal teams and enhancements for our products and services.

 

Primary Responsibilities

  • Design, build, test, and deploy a data pipeline using standard tools, such as Apache Airflow, Jenkins, and various AWS services.
  • Coordinate with other code authors to implement their Python scripts directly, or their desired functionality indirectly, in the pipeline.
  • Maintain and improve the data pipeline as new requirements arise.
  • Serve as a technical resource for ABC and other teams on appropriate subjects, e.g. pipelines, tooling, and development at scale.
  • Participate in, and sometimes lead, team ceremonies: standup, planning, retrospective, backlog refinement, stakeholder meetings, and demos.
  • Attend and participate in larger company sync-ups and presentations to stay current with and contribute to company procedures and priorities.
  • Create and manage feature demos, prototypes, and documentation.
  • Establish strong relationships with internal teams and specific customer staff.

Role-Based Competencies

  • Continuous Improvement: Continually focused on improving the responsiveness and quality of the solutions delivered.
  • Flexibility/Adaptability: Adjusts quickly to changing priorities and conditions. Copes effectively with complexity and change.
  • Intelligence: Learns quickly. Demonstrates ability to quickly and proficiently understand and absorb new information.
  • Teamwork: Reaches out to peers and cooperates with supervisors to establish an overall collaborative working relationship.

 

Required Qualifications

  • Minimum five (5) years of progressive experience in production workflow management and automation, e.g., CI/CD, infrastructure as code, and version control.
  • Ability to multi-task and work independently while maintaining team involvement; organized and detail-oriented, with strong problem-solving and analytical skills.

 

Preferred Qualifications

  • Bachelor’s degree (or higher) in Computer Science, Library and Information Science, or other discipline combined with equivalent coursework and experience.
  • Experience building Airflow DAGs and/or Jenkins automations.
  • Experience configuring and using Amazon Web Services (AWS), e.g., DynamoDB tables, ElastiCache Redis caches, and Lambda functions; especially with CloudFormation templates.
  • Experience developing and debugging in Python.
  • Experience with library software data standards, like MARC and/or BIBFRAME.
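For context on the MARC standard named above: every MARC 21 record begins with a fixed 24-byte leader whose character positions encode record-level metadata. A dependency-free sketch of reading a few of those positions (the sample leader bytes here are fabricated for illustration):

```python
# Minimal sketch of decoding fixed positions in a MARC 21 record leader
# (the first 24 bytes of every record). Positions follow the MARC 21
# bibliographic format; the sample leader below is fabricated.
leader = b"00714cam a2200205 a 4500"

record_length = int(leader[0:5])   # 00-04: total record length in bytes
record_status = chr(leader[5])     # 05: e.g. 'c' = corrected or revised
record_type = chr(leader[6])       # 06: e.g. 'a' = language material
base_address = int(leader[12:17])  # 12-16: offset where data fields begin

print(record_length, record_status, record_type, base_address)
# prints: 714 c a 205
```

Real pipelines would normally use a parsing library rather than raw offsets, but the fixed-position layout is what makes MARC data amenable to automated conversion.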

 

Target Annual Compensation Range: $88,550 - $126,500

The actual salary offer will carefully consider a wide range of factors including your skills, qualifications, education, training, and experience, as well as the position’s work location. EBSCO provides a generous benefits program including medical, dental, vision, life and disability insurance, flexible spending accounts, a retirement savings plan, paid parental leave, holidays and paid time off (PTO), as well as tuition reimbursement. View more about EBSCO’s benefits here: https://www.ebsco.com/about/benefits

 

 

 

We are an equal opportunity employer and comply with all applicable federal, state, and local fair employment practices laws. We strictly prohibit and do not tolerate discrimination against employees, applicants, or any other covered persons because of race, color, sex, pregnancy status, age, national origin or ancestry, ethnicity, religion, creed, sexual orientation, gender identity, status as a veteran, and basis of disability or any other federal, state or local protected class. This policy applies to all terms and conditions of employment, including, but not limited to, hiring, training, promotion, discipline, compensation, benefits, and termination of employment. We comply with the Americans with Disabilities Act (ADA), as amended by the ADA Amendments Act, and all applicable state or local law.

