About SymphonyAI
SymphonyAI is a leading enterprise AI solutions provider helping retailers and manufacturers optimize business operations through advanced analytics, planning, and automation. Our products support global organizations in improving supply chain efficiency, inventory performance, forecasting accuracy, and customer satisfaction. We are committed to delivering measurable outcomes for our clients through innovative technology, deep domain expertise, and strong customer partnerships.
Job Description
About the role:
- Support the development and maintenance of scalable data pipelines for a Retail Data Lake solution using Azure Data Factory (ADF), Python, and SQL.
- Assist in ingesting, transforming, and loading data from multiple sources into Snowflake and SQL Server environments.
- Work with structured and semi-structured retail datasets to ensure accurate, consistent, and high-quality data delivery.
- Participate in designing and building ETL workflows under the guidance of senior data engineers.
- Support data validation and reconciliation activities to ensure data integrity across pipeline stages and target systems.
- Analyze and understand existing data transformation logic and assist in documenting pipeline behavior and business rules.
- Collaborate with Professional Services teams to support client data onboarding and integration into the Retail Data Lake platform.
- Assist in troubleshooting pipeline failures, identifying root causes, and supporting resolution efforts.
- Contribute to improving data quality checks, monitoring, and automation within data workflows.
- Work with cross-functional teams to translate business requirements into technical data solutions.
- Gain exposure to cloud-based data engineering practices using the Microsoft Azure ecosystem and modern data warehouse technologies.
About you
- Basic understanding of data engineering concepts (ETL/ELT, data pipelines, data warehousing)
- Familiarity with SQL (querying, joins, aggregations, data validation)
- Exposure to Python for data processing or scripting
- Awareness of cloud data platforms, ideally Microsoft Azure
- Interest or experience with Azure Data Factory (ADF) or similar orchestration tools
- Understanding of relational databases such as SQL Server
- Exposure to or willingness to learn Snowflake or modern cloud data warehouses
- Strong analytical mindset with attention to data quality and accuracy
- Ability to understand and interpret business logic and transformation rules
- Good problem-solving skills and willingness to troubleshoot data issues
- Strong communication skills and ability to work in a team environment
- Comfortable collaborating with technical and non-technical stakeholders (e.g., Professional Services / client teams)
- Eagerness to learn modern data engineering tools and cloud technologies
- Interest in or exposure to AI-assisted coding tools (e.g., using AI to support Python/SQL development or debugging)
- Familiarity with or curiosity about prompting techniques for interacting with AI tools effectively
- Interest in automation using AI or scripting to reduce manual data tasks and improve efficiency
- Willingness to explore how AI can be used to improve data quality checks, monitoring, and workflow automation
- Interest in learning how AI integrates into modern data engineering pipelines and cloud environments
About Us
WHY SYMPHONYAI?
- Start-up spirit within a large international company
- Exciting technology including Artificial Intelligence
- Communication, Innovation, and Collaboration are among our watchwords
- We will support and explore your ideas: if you can do it better, we want you to show us!
- Our teams comprise incredibly talented and passionate people who love what they do