A global commodities trading firm is seeking a Data Engineer / Analyst to support its growing analytics and technology infrastructure. This role is ideal for a technically skilled professional who enjoys building data tools, optimizing pipelines, and directly collaborating with trading desks and business users. You’ll contribute to the development of data-driven systems that enable timely decision-making across the organization.
This is a front-facing position that bridges data engineering with business intelligence and requires strong communication, analytical thinking, and independent problem-solving ability.
Key Responsibilities:
- Participate in the design and enhancement of internal data and reporting solutions.
- Develop and maintain ETL/ELT pipelines, data management tools, and dashboards for internal and external datasets.
- Improve existing data models and workflows to enhance usability, reliability, and speed.
- Expand coverage of market and public data sources and ensure ongoing data freshness and integrity.
- Identify and resolve issues with incomplete, inaccurate, or outdated data.
- Work iteratively in an agile, release-focused environment, maintaining continuous feedback loops with business users.
- Build APIs and integration points with external data services using common interchange formats (e.g., JSON, XML, CSV, Parquet).
- Apply best practices in software development, including source control, testing, and documentation.
Preferred Technical Experience:
- Scripting & Data Processing: Proficiency in Python or R, with knowledge of data access/manipulation libraries (e.g., Pandas, NumPy).
- Data Engineering: Experience designing and querying relational databases; familiarity with MPP or other distributed systems is a plus.
- Cloud Tools: Exposure to cloud-based data storage and processing technologies.
- Reporting & Visualization: Proficiency in tools like Power BI, Tableau, or open-source charting libraries (e.g., Plotly, Bokeh).
- Web & API Skills: Understanding of RESTful APIs, authentication protocols, and data integration practices.
- Systems & Tools: Experience in both Windows and Linux environments; strong command of Excel; comfortable working with source control and DevOps pipelines.
Qualifications:
- Bachelor’s degree in Computer Science, Data Science, Engineering, or a related technical field.
- At least 2 years of professional experience working with data pipelines, analytics systems, or market data platforms.
- Experience working with large and diverse datasets in a time-sensitive, data-driven environment.
- Strong English communication skills, both written and verbal.
- Ability to operate independently and manage competing priorities with minimal supervision.