About Dynamico AI:
At Dynamico AI, we’re at the forefront of bringing AI solutions to the business world, focusing on agility, scalability, and compliance. Our mission is to democratize AI, enabling companies to start their AI journey with impactful, manageable projects and scale effectively. We ensure that our AI solutions are not only innovative but also secure and fully compliant with current regulations. Join our mission to revolutionize businesses with cutting-edge AI technology.
Role Overview:
We are looking for a skilled and experienced Data Engineer to join our dynamic team. In this role, you will play a key part in building data pipelines, setting up Retrieval-Augmented Generation (RAG) systems, and developing AI solutions. You will be responsible for the efficient management and processing of data, as well as for developing scalable, robust systems that support AI-driven applications.
Key Responsibilities:
- Design, build, and maintain scalable data pipelines to process and transform large datasets.
- Implement and optimize data storage solutions to enable efficient retrieval and analysis of data.
- Set up Retrieval-Augmented Generation (RAG) systems to enhance the quality and relevance of AI-generated content.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Develop and implement AI solutions, leveraging Python and other relevant programming languages.
- Ensure the integrity, reliability, and security of data throughout the data engineering process.
- Monitor and optimize data infrastructure performance to meet the needs of AI applications.
- Stay up-to-date with the latest advancements in data engineering technologies and techniques.
Requirements:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- Minimum of 3 years of professional experience as a Data Engineer or in a similar role.
- Strong proficiency in Python for data manipulation and analysis.
- Experience building and optimizing data pipelines and ETL processes.
- Familiarity with Retrieval-Augmented Generation (RAG) techniques and frameworks.
- Knowledge of relational (SQL) and NoSQL databases and of data warehousing concepts.
- Proficiency in working with cloud platforms such as Azure or AWS.
- Strong problem-solving and analytical skills, with the ability to handle complex data engineering challenges.
- Excellent communication and collaboration skills, with the ability to work effectively in a team environment.
What we offer:
- The opportunity to work remotely.
- A supportive, collaborative environment where you can grow your skills in AI and its applications in economics.
- Exposure to real-world projects with cutting-edge AI technology.
- A chance to contribute to innovative solutions that have a tangible impact on business strategy and economic analysis.