Current orders for Database Developer

SRE/DevOps Engineer

Office
Remote
Full-time
Permanent position

Looking for an SRE/DevOps Engineer to work on a trading product.


Requirements:

  • Strong knowledge of Linux (or any other Unix);
  • Experience with DevOps tools (Docker, Jenkins, Gitlab-CI, Ansible, Terraform, Chef, Puppet etc.);
  • Understanding of how web servers work (e.g. Nginx);
  • Understanding of the HTTP stack;
  • Understanding of CI/CD;
  • Version control systems: Git;
  • Knowledge of SQL;
  • Basic knowledge of databases: PostgreSQL/MySQL;
  • Basic knowledge of networks.


Tasks/responsibilities:

  • Monitor the operation of reporting systems in production, resolve current issues and work on improving how the systems run: find errors and performance degradation in logs, detect problems in the interaction of services, analyze application performance metrics and the system metrics of the hosts the application is deployed on, and create tasks for the development team to fix the problems (a minimal sketch of such a metric check follows after this list);
  • Incident resolution, root cause analysis, reporting, and collaboration in problem solving, including problems related to other teams;
  • Configure and adjust monitoring of services;
  • Build, release and configuration management of production systems;
  • Deploying, automating, maintaining and managing the AWS cloud-based production systems to ensure their availability, performance, scalability and security;
  • Managing dev, QA and production environments.
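
A minimal sketch, assuming CloudWatch as the metrics source on AWS, of the kind of host-metric check described in the monitoring task above; the instance ID, metric and threshold are illustrative assumptions rather than details from the posting:

```python
"""Illustrative only: flag a host whose CPU has been high for the last 15 minutes."""
from datetime import datetime, timedelta, timezone

import boto3

cloudwatch = boto3.client("cloudwatch")


def cpu_is_degraded(instance_id: str, threshold: float = 90.0) -> bool:
    """Return True if the instance's average CPU over the last 15 minutes exceeds the threshold."""
    now = datetime.now(timezone.utc)
    stats = cloudwatch.get_metric_statistics(
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
        StartTime=now - timedelta(minutes=15),
        EndTime=now,
        Period=300,  # one datapoint per 5 minutes
        Statistics=["Average"],
    )
    return any(point["Average"] > threshold for point in stats["Datapoints"])


if __name__ == "__main__":
    # Hypothetical instance ID; in practice the alerting would live in Zabbix or
    # Grafana, and a check like this only illustrates the kind of signal that
    # turns into a task for the development team.
    if cpu_is_degraded("i-0123456789abcdef0"):
        print("CPU above threshold for 15 minutes - open a task for the development team")
```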


Tech stack:

  • *nix family OS;
  • AWS;
  • Kubernetes;
  • ELK;
  • Zabbix, Grafana, Dynatrace;
  • Git;
  • Jenkins, Gitlab CI;
  • Terragrunt;
  • PostgreSQL;
  • Kafka;
  • Consul, Vault.


Relocation to Montenegro.



Data engineer

Office
Remote
Full-time

Who are we?

A fintech startup working on the first large-scale e-wallet in the region. We aim to provide people with a simple and convenient alternative to cash.

Role Description

The Data Engineer will be responsible for improving and expanding our data capabilities:

· You will design and develop data applications in one of the following languages: Python, Scala, or SQL

· Develop, customize and manage data integration tools, databases, warehouses, and analytical systems

· Design, build and maintain scalable data models that are clean, tested, and well documented

· Work closely with product teams and other stakeholders to design and build data marts that provide reliable and accurate data quickly (a minimal sketch of such a mart follows after this list)

· Handle all our data pipelines and contribute towards our data strategy and its execution
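
A minimal sketch of the kind of data mart referred to above, written with PySpark since Spark is named in the requirements; the bucket paths, column names and aggregation are hypothetical placeholders, not part of the role description:

```python
"""Illustrative only: build a small, documented data mart from raw order events."""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_data_mart").getOrCreate()

# Read raw events (the source layout is an assumption for the example).
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Aggregate into a clean mart: daily order counts and totals per country.
daily_orders = (
    orders
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date", "country")
    .agg(
        F.count("*").alias("orders_cnt"),
        F.sum("amount").alias("orders_amount"),
    )
)

# Write the mart back as partitioned Parquet for downstream analytics.
daily_orders.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/marts/daily_orders/"
)
```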


Experience, Competencies and Skills Required

· 2+ years of experience as a Data Engineer building data pipelines and analytical data models

· At least 2 years of hands-on experience with Python, SQL. Knowledge of Scala is a plus

· Strong competencies in algorithms and software architecture

· Strong experience in real-time data processing and data ingestion

· At least 2 years of hands-on experience with Big Data systems like Hadoop, Spark

· Experience building data infrastructure using at least one major cloud provider, preferably AWS

· Knowledge of Terraform is a plus

· Knowledge of Scrum, Agile

· Advanced English, good Russian

Full-stack engineer

Remote
Full-time

Project

The (further) development of software for iterative experiment design, data visualisation and connection to the Alb. database.

 

Backend Components:

The software development therefore also covers the design and development of connectivity to the Alb. database via a REST API, as well as a frontend with a user interface. The first component, API development, is to be built using Python and TypeScript together with serverless technologies based on Azure. The developer will be responsible for creating the web application's backend components from scratch, which will be exposed to the frontend as REST Web APIs. In this role, you should be able to write functional code that handles data from external REST APIs as well as from a NoSQL database.
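
A minimal sketch, under stated assumptions, of such a backend component: an Azure Functions HTTP handler in Python that queries a Cosmos DB container and returns JSON to the frontend. The database, container, route and setting names are hypothetical, and the usual function.json HTTP-trigger binding to `main` is assumed:

```python
"""Illustrative only: a serverless REST endpoint backed by a NoSQL store."""
import json
import os

import azure.functions as func
from azure.cosmos import CosmosClient

# Connection settings are assumed to come from app settings; the database and
# container names are placeholders, not taken from the project description.
client = CosmosClient(os.environ["COSMOS_URL"], credential=os.environ["COSMOS_KEY"])
container = client.get_database_client("experiments").get_container_client("runs")


def main(req: func.HttpRequest) -> func.HttpResponse:
    """Return experiment runs for a given experiment id as JSON."""
    experiment_id = req.params.get("experimentId")
    if not experiment_id:
        return func.HttpResponse("experimentId is required", status_code=400)

    items = list(container.query_items(
        query="SELECT * FROM c WHERE c.experimentId = @id",
        parameters=[{"name": "@id", "value": experiment_id}],
        enable_cross_partition_query=True,
    ))
    return func.HttpResponse(json.dumps(items), mimetype="application/json")
```

A real component would add error handling and pagination; the sketch only shows how a backend piece can be exposed to the frontend as a REST Web API while reading from the NoSQL database.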

 

Frontend:

The second, frontend component requires proficiency in AngularJS and TypeScript with the ability to make code design decisions. In addition, the developer should have Python development skills, experience in API development and testing, and experience with data visualization frameworks such as D3.js.

 

Backlog items containing business requirements and acceptance criteria will be assigned in Azure DevOps.


Tasks:

-       Creation of a scalable and responsive cloud-based web application using state-of-the-art frontend engineering practices by:

o  Designing a technical concept, coding, testing and documentation according to the backlog item based on TypeScript and Python [BE]

o  Designing a technical concept, coding, testing and documentation according to the backlog item based on AngularJS and TypeScript [FE]

o  Translation of UI/UX design wireframes to actual code, by developing web user interface components and related tests [FE]

o  Designing and testing of APIs [FE]

o  Implement serverless technologies based on Azure, Azure Cosmos DB as well as Azure Event Hub [BE]

o  Creating web application backend components from scratch, which will be exposed as REST Web APIs to frontend [BE]

o  Write functional code, which handles data from external REST APIs and from NoSQL database [BE]

o  Implementing JWT-based authentication and access token handling (a minimal sketch follows after this list) [BE]

o  Performing visualization within data visualization frameworks like D3.js [FE]

o  Developing CI/CD build pipelines for the developed web components and taking action to keep code/systems stable and efficient [BE, FE]

o  Testing including unit, integration, and performance tests [BE, FE]

-       Identify, design, and implement product improvements for project-related tasks according to the discussed backlog [BE, FE]

-       Documentation of the technical implementation and related processes in Azure DevOps. The company will validate and approve it [BE, FE]
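
A minimal sketch of the JWT-based authentication and access-token handling item above, using PyJWT with a symmetric HS256 key for brevity; the real token issuer, claims and key management are project specifics not given here:

```python
"""Illustrative only: issue and validate short-lived access tokens with PyJWT."""
import os
from datetime import datetime, timedelta, timezone
from typing import Optional

import jwt

# Assumption: a symmetric signing key supplied via app settings.
SECRET = os.environ.get("JWT_SECRET", "change-me")


def issue_token(user_id: str) -> str:
    """Issue a short-lived access token for a user."""
    claims = {
        "sub": user_id,
        "exp": datetime.now(timezone.utc) + timedelta(minutes=30),
    }
    return jwt.encode(claims, SECRET, algorithm="HS256")


def user_from_header(authorization: Optional[str]) -> Optional[str]:
    """Validate a 'Bearer <token>' header and return the user id, or None if invalid."""
    if not authorization or not authorization.startswith("Bearer "):
        return None
    try:
        claims = jwt.decode(authorization.removeprefix("Bearer "), SECRET, algorithms=["HS256"])
    except jwt.InvalidTokenError:
        return None
    return claims.get("sub")
```

In a backend handler, user_from_header() would be called on the incoming Authorization header before any data is read from the NoSQL database.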

Backend Ruby Engineer

Remote
Full-time
Project-based
Project:

Senior Backend Software Engineers build the core of the business logic services. Internal tools, partner-focused APIs and consumer-oriented apps all rely on these services.

Tasks:

- Design, build, and maintain APIs, services, and systems across the businesses.
- Debug production issues across services and multiple levels of the stack.
- Work with engineers across the company to build new features.
- Improve engineering standards, tooling, and processes.

Requirements:

- Experience designing and building APIs.
- Understanding of the value of automated testing as part of the implementation, maintenance and improvement of our systems; readiness to promote these values across the organization.
- Knowing the value of good code design practices for speeding up development and extending our systems.
- Good discipline when it comes to the engineering process.
- Ability to excel in multicultural and multidisciplinary environments.
- Skills to shape rational technical deliverables from business requirements.

Stack (our team uses the following tools, but we do not expect you to be an expert or to have experience with all of them):

- Ruby on Rails; knowledge of any other language that allows or favors OOD is welcome.
- PostgreSQL, MySQL, Redis, DynamoDB, S3; knowledge of other database or storage solutions is more than welcome.
- Our systems are deployed and maintained mostly on AWS; experience with other PaaS providers would be seen as a plus.
- We also have responsibility for some services that use ReactJS, Node and Java.

Other skills that would be an advantage:

- Knowledge of Android SDK, ReactJS and/or React Native.
- Proficiency in Java, Spring and Spring Boot, Kotlin or JavaScript.
- Experience with UI development.
- Good command of algorithms, data structures and design patterns.
- Advanced English.

Data Processing Engineer

Remote
Full-time
Project-based
We are looking for senior data engineers for a large broadcasting company in Germany.

Scope of services:

- Implementation of data pipelines for preparing, delivering and versioning data for model training (an illustrative sketch follows below);
- Advising Data Scientists on machine learning model development, especially on running these models in production;
- Design and implementation of microservices that expose models via REST APIs, including monitoring of these models in production;
- Deployment of the microservices in a production cloud environment with high-availability requirements in mind.

Tech stack used:

- Google Cloud Platform, Terraform, GitLab;
- Kubernetes, Docker, Airflow, MLflow;
- BigQuery, BigTable;
- Python, PySpark, SQL;
- REST API.

Duration: 2 months, with the possibility of extension.
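
A minimal sketch of the training-data pipeline mentioned above, as an Airflow DAG skeleton (Airflow is part of the listed stack); the task names and the extract/validate/publish steps are placeholders, not details of the actual project:

```python
"""Illustrative only: a skeleton DAG for preparing and versioning training data."""
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_training_data():
    """Pull raw data from the source system (placeholder)."""


def validate_training_data():
    """Run basic quality checks before the data is handed to model training (placeholder)."""


def publish_versioned_dataset():
    """Write a versioned snapshot for reproducible model training (placeholder)."""


with DAG(
    dag_id="training_data_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_training_data)
    validate = PythonOperator(task_id="validate", python_callable=validate_training_data)
    publish = PythonOperator(task_id="publish", python_callable=publish_versioned_dataset)

    # Linear dependency: extract, then validate, then publish the versioned dataset.
    extract >> validate >> publish
```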