
Galileo Financial Technologies

Who we are:
Welcoming, collaborative, and full of opportunities to make an impact: that's how our employees describe working here. Galileo is a financial technology company that provides innovative and revolutionary software products and services that power some of the world's largest Fintechs. We are the only payments innovator that applies tech and engineering capabilities to empower Fintechs and financial institutions to unleash their full creativity and achieve their most inspired goals. Galileo leads its industry with superior fraud detection, security, decision-making analytics, and regulatory compliance functionality, combined with customized, responsive, and flexible programs that accelerate the success of all payments companies and solve tomorrow's payments challenges today. We hire energetic and creative employees and give them the opportunity to excel in their careers and make a difference for our clients. Learn more about us and why we work here at https://www.galileo-ft.com/working-at-galileo.
Team:
SoFi is seeking an experienced and motivated Senior Data Engineer to drive high-quality technical solutions for the Data Products team within the SIPS (Spend, Invest, Protect, Save) division, supporting all SoFi Financial Services. The mission of the SIPS Data Engineering team is to support data engineering and reporting for SoFi’s Financial Services products. As a technical leader, you will set the vision and strategy for building the foundational, critical data models that are heavily leveraged across SoFi for analytical, reporting, and machine learning use cases. Our goal is to empower consumers to make data-driven decisions and effectively measure their results by providing high-quality, highly available, and democratized data.
Role:
We are looking for a talented, enthusiastic, detail-oriented, and experienced Data Engineer who knows how to take on big data challenges in an agile way. This includes big data design and analysis, data modeling, and the development, deployment, and operation of big data pipelines. You will lead development of some of our most critical data pipelines and data sets, and expand self-service data knowledge and capabilities. This role requires you to live at the intersection of data and engineering: you should have a deep understanding of data, analytical techniques, and how to connect insights to the business, along with practical experience insisting on the highest standards of operations in ETL and big data pipelines.
What you’ll do:
- Design and develop robust data models and pipelines to support data ingestion, processing, storage, and retrieval. Evaluate and select appropriate technologies, frameworks, and tools to build scalable and reliable data infrastructure.
- Optimize data engineering systems and processes to handle large-scale data sets efficiently. Design solutions that can scale horizontally and vertically.
- Collaborate with cross-functional teams, such as data scientists, software engineers, and business stakeholders, to understand data requirements and deliver solutions that meet business needs. Effectively communicate complex technical concepts to non-technical stakeholders.
- Enforce data governance policies and practices to maintain data integrity, security, and compliance with relevant regulations. Collaborate with data governance and security teams to implement robust data protection mechanisms and access controls.
What you’ll need:
- A bachelor's degree in Computer Science, Data Science, Engineering, or a related field.
- 5+ years of experience in data engineering and analytics technical strategy.
- Proficiency in the data engineering tech stack: Snowflake, PostgreSQL, Python, SQL, GitLab, AWS, Airflow, dbt, and others.
- Proficiency in relational and cloud database platforms such as Snowflake, Redshift, or GCP.
- Strong skills in Python and/or another data-centric language.
- Thorough knowledge of data modeling, database design, data architecture principles, and data operations.
- Strong analytical and problem-solving abilities, with the capability to simplify complex issues into actionable plans.
- Experience in the Fintech industry is advantageous.