Cloud Developer
Job Location
- Prague (or remotely)
Job Content
- Apply cloud computing skills and services (Azure, AWS, GCP) to deploy various solutions
- Gather user & developer requirements to design, develop and implement tailor-made applications
- Build and maintain CI/CD pipelines
- Troubleshoot production issues and coordinate with the development team to streamline code deployment
- Improve AI & ML solutions for our clients, as well as engineering tools, systems, procedures and data security
Requirements:
- TypeScript, JavaScript
- Azure Event Hubs
- Serverless functions (AWS Lambda; see the sketch at the end of this listing)
- Kubernetes platforms
- CI/CD, IaC (Infrastructure as Code)
- Databases (SQL and NoSQL)
- Cloud platforms (AWS, Azure)
- Linux, Bash
Nice to have
- DevOps knowledge
- Object storages (ARS2)
- ARM templates / Terraform
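
For illustration, here is a minimal sketch of a serverless function of the kind this role works with, written as an AWS Lambda handler in Python (the handler name, event fields and response shape are assumptions for the sketch; day-to-day work in this role would more likely be in TypeScript/JavaScript):

    # Minimal sketch of an AWS Lambda handler (illustrative assumptions only).
    import json
    import logging

    logger = logging.getLogger()
    logger.setLevel(logging.INFO)

    def handler(event, context):
        """Entry point invoked by AWS Lambda (configured as <module>.handler)."""
        logger.info("Received event: %s", json.dumps(event))

        # Echo a field from the incoming payload back to the caller.
        name = event.get("name", "world")
        return {
            "statusCode": 200,
            "body": json.dumps({"message": f"Hello, {name}"}),
        }

In practice such a function would be built and deployed through a CI/CD pipeline and provisioned with IaC (e.g. ARM templates or Terraform) rather than by hand.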
Big Data Infrastructure Engineer
Job Location
- Prague (or remotely)
Job Content
- Linux administration and security
- Participation in designing and deploying various Big Data platforms (mainly Cloudera distribution)
- Big Data platform operation and support, including 3rd-party services (RDBMS, orchestration and automation tools, etc.)
- Participation in end-to-end implementation of solutions supporting various business needs (e.g. a Docker-based GPU environment for data scientists)
- Participation in DevOps teams, advising on solution architecture from a platform and technology perspective
Requirements:
- Experience with Linux administration and security
- Experience with technical analysis, provisioning, and troubleshooting of solution architectures
- Experience with operating and maintaining Hadoop-based platforms (Cloudera, Hortonworks, MapR)
- Advanced English
Nice to have
- Experience with orchestration and automation tools such as Airflow and Ansible (see the DAG sketch at the end of this listing)
- Experience with Kubernetes or OpenShift
- Python
- Hands-on experience with cloud services like Azure or AWS
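
For illustration, here is a minimal sketch of the kind of orchestration such a platform relies on, written as an Airflow DAG in Python (Airflow 2.x style; the DAG id, schedule and task commands are assumptions for the sketch):

    # Minimal sketch of an Airflow DAG (illustrative assumptions only, Airflow 2.x).
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="example_daily_ingest",      # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract = BashOperator(
            task_id="extract",
            bash_command="echo 'pull data from source systems'",
        )
        load = BashOperator(
            task_id="load",
            bash_command="echo 'load data into the Big Data platform'",
        )

        extract >> load  # run extract before load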
Big Data Developer
Job Location
- Prague (or remotely)
Job Content
- Be a part of an agile international team working on launching one of the biggest Data Lakes in the world
- Contribute to and implement high-performance data pipelines for distributed systems (see the sketch at the end of this listing)
- Build, deploy, operate and maintain applications on analytical platforms
- Development of data related applications
Requirements:
- Development experience and proficiency in Scala
- Experience with Big Data tools like Spark, Kafka or Elasticsearch
- Good SQL knowledge and experience working with relational databases and query authoring, as well as working familiarity with a variety of databases (Hive, Impala, Kudu a plus)
- Advanced English
- Knowledge of CI/CD and Unit Testing
- Experience with building and optimising data pipelines
Nice to have
- Experience with Docker or Kubernetes
- Python
- Hands-on experience with cloud services like Azure or AWS
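
For illustration, here is a minimal sketch of a batch data pipeline of the kind described above, written in PySpark (the role's primary language is Scala; the paths, column names and aggregation are assumptions for the sketch):

    # Minimal sketch of a batch data pipeline in PySpark (illustrative assumptions only).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("example-events-pipeline").getOrCreate()

    # Read raw events (path and schema are assumed for the sketch).
    events = spark.read.parquet("/data/raw/events")

    # Keep valid rows and aggregate per day and event type.
    daily_counts = (
        events
        .filter(F.col("event_type").isNotNull())
        .groupBy(F.to_date("event_ts").alias("event_date"), "event_type")
        .count()
    )

    # Write partitioned output for downstream consumers (e.g. Hive or Impala tables).
    (
        daily_counts.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("/data/curated/daily_counts")
    )

    spark.stop()

A production pipeline would additionally carry unit tests and run through CI/CD, as listed in the requirements above.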