About Dynamico AI:
At Dynamico AI, we’re at the forefront of bringing AI solutions to the business world, focusing on agility, scalability, and compliance. Our mission is to democratise AI, enabling companies to start their AI journey with impactful, manageable projects and scale effectively. We ensure that our AI solutions are not only innovative but also secure and fully compliant with current regulations. Join our mission to revolutionise businesses with cutting-edge AI technology.
Role Overview:
We are seeking an experienced DevOps Engineer to join our dynamic team in Poznań, Poland. In this role, you will be instrumental in developing, operating, and maintaining our infrastructure and deployment processes, ensuring the smooth operation and scalability of our AI-driven applications. As a DevOps Engineer, you will work closely with our development and operations teams to automate and optimise our software delivery pipeline, monitor system performance, and implement infrastructure as code. If you are passionate about continuous integration, cloud technologies, and ensuring the reliability and security of software products, we want to hear from you!
Key Responsibilities:
- Design, implement, and maintain our infrastructure using modern DevOps practices.
- Collaborate with development and operations teams to automate deployment processes and ensure smooth software release cycles.
- Monitor system performance, troubleshoot issues, and optimise scalability and availability.
- Implement and manage infrastructure as code using tools such as Terraform, Azure ARM/Bicep, and CloudFormation.
- Ensure the security of our systems by implementing security best practices and configuring appropriate access controls.
- Collaborate with cross-functional teams to define and implement GitHub CI/CD pipelines.
Requirements:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- Minimum of 3 years of professional experience as a DevOps Engineer or in a similar role.
- Proficiency in Kubernetes system administration and troubleshooting.
- Strong understanding of cloud-based infrastructure, preferably AWS or Azure.
- Solid scripting skills (Python) for automation and configuration management.
- Familiarity with CI/CD tools and practices, such as GitHub CI/CD.
- Strong knowledge of infrastructure as code using tools like Terraform or ARM/Bicep.
- Knowledge of monitoring and logging tools such as DataDog, Prometheus, or Grafana.
- Strong problem-solving and communication skills (English and Polish), with the ability to work effectively in a team.
What we offer:
- The opportunity to work remotely.
- A supportive, collaborative environment where you can grow your skills in AI and its applications in the field of economics.
- Exposure to real-world projects with cutting-edge AI technology.
- A chance to contribute to innovative solutions that have a tangible impact on business strategy and economic analysis.