Senior Big Data Engineer

BNY Mellon
in New York, NY
Permanent, Full time
Last application, 29 Oct 20
Senior Big Data Engineer

Clearing, Markets & Issuer Services Technology (CMIST) is responsible for application development and support for critical business systems including Repo Edge (collateral management), Enterprise Payment Hub (multi-currency payment processing), and Broker Dealer Clearance (securities clearing), along with approximately 350 other applications used by these high-priority business services and their clients.

Clearance and Collateral Technology (CCT) within CMIST builds clearance and collateral management platforms to service BNY Mellon's broker-dealer clients and is the sole provider for Government Securities Clearing Services. Considering the core nature of the business and the dominant market share, high performance and resiliency are key pillars of the technology architecture. The group is currently focused on Repo Modernization, which will pave the way to the Future of Collateral for the business - providing strengthened platform resiliency and new trade capabilities within a unified system. All CCT applications are built for real-time operations, aided by a data warehouse for reporting and business insights.

This is a 100% hands-on role and does not involve any team management responsibilities:
  • Develop, deploy, and maintain scalable, optimized data pipelines and the supporting data platforms and infrastructure
  • Standardize all data pipeline processes, develop reusable modules (auditing, logging, traceability), and integrate them with the DevOps pipeline
  • Recommend improvements to the existing data integration architecture, and implement standard frameworks that automate manual processes, improve overall scalability, increase reliability, and improve quality
  • Define the right data models in collaboration with ML engineers, data scientists, and reporting/analytics developers - optimizing data storage, improving data accessibility, and making the models reusable

Serves as the technical expert in the design, development, implementation and maintenance of data, reporting and database technologies and tools. Consults with businesses to resolve highly complex data issues. Formulates standards, processes and procedures to align with the data architecture/management for major application projects. Leads the creation and evolution of the strategy and direction of database design, business intelligence and analytics. Leads development of complex database designs in multiple parallel projects through in-depth understanding of business needs and functionalities. Consults with database administration and client areas and provides solutions in resolving highly complex issues during the translation to a physical database design.

Provides expertise in the most complex processes of integrating data across existing and modified applications. Provides innovative direction and guidance on reports and ensures recommendations are aligned with user needs and capabilities. Stays abreast of emerging technologies and identifies potential use of new and existing technology within lines of business by participating in industry-wide conferences and research and by having in-depth knowledge of various business areas. Contributes to the achievement of Data Modeling/Warehousing objectives.

  • BS degree (Computer Science, Math, Physics, Engineering). MS / PhD preferred.
  • 12+ years of experience in software development required
  • 5+ years overall hands-on experience in building and maintaining production grade data pipelines, developed in Python or Java, that support critical business functions, reporting, and analytics
  • 3+ years overall hands-on experience in Python
  • Very strong SQL skills, with multiple years of experience developing and maintaining complex SQL queries across different standard and big data technologies (preferred)
  • Extensive experience with big data platforms, tools, and technologies (Cloudera Hadoop, Spark, Hive, Impala, Dremio, Google BigQuery, ECS, etc.) - understanding their architecture, infrastructure requirements, troubleshooting techniques, integration with other platforms, and automation of processes
  • Should have experience handling data sets of large volume/size (billions of rows / > 100 TB), width (> 200 columns), and high velocity
  • Hands-on experience around compute and storage in any one of the public cloud platforms such as Google Cloud (preferred), AWS, or Azure is highly preferred.
  • Experience in the securities or financial services industry is a plus.

BNY Mellon is an Equal Employment Opportunity/Affirmative Action Employer.
Minorities/Females/Individuals With Disabilities/Protected Veterans.

Our ambition is to build the best global team - one that is representative and inclusive of the diverse talent, clients and communities we work with and serve - and to empower our team to do their best work. We support wellbeing and a balanced life, and offer a range of family-friendly, inclusive employment policies and employee forums.

Primary Location: United States-New York-New York
Internal Jobcode: 96070
Job: Information Technology
Organization: Clearing Markets ISS Svcs Tech-HR16624
Requisition Number: 2009969