Job posting code: KP2619681450

Data Engineer

Charisma
Tehran - Arjantin
Posted on the Jobvision website (3 days ago)
Job details:
Employment type: Full-time
Working days: Saturday to Wednesday
Full job description:

About the Role

We are looking for a highly skilled Data Engineer to join our Data Platform team.
In this role, you will design, build, and operate scalable data pipelines, perform complex data integrations between internal and external systems, and create analytical datasets powering BI, portfolio analytics, real-time applications, and machine-learning workloads.

This position is fully on-premises, with deep involvement in system integration, ETL/ELT, streaming, and data modeling.


Key Responsibilities

1. Data Pipelines & Processing

  • Design, implement, and optimize batch and real-time data pipelines using Python, Kafka, NiFi, SQL, and distributed databases (a minimal sketch follows this list).
  • Build end-to-end ETL/ELT flows for financial and operational systems including OMS, custody (CSD), market data, user events, transactional systems, and portfolio snapshots.
  • Integrate, clean, normalize, and enrich datasets from mixed sources (databases, APIs, Kafka topics, files).
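
To make the pipeline work above concrete, here is a minimal sketch of one real-time ingestion step in Python: consume events from a Kafka topic with confluent-kafka and upsert them into PostgreSQL with psycopg2. The topic, table, consumer group, and connection details are hypothetical placeholders, not Charisma's actual schema.

```python
import json

import psycopg2
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "portfolio-snapshots-loader",   # hypothetical consumer group
    "auto.offset.reset": "earliest",
    "enable.auto.commit": False,                # commit only after the row is stored
})
consumer.subscribe(["portfolio.snapshots"])      # hypothetical topic name

conn = psycopg2.connect("dbname=analytics user=etl")  # hypothetical connection string

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        # Normalize the incoming record before loading (cast types, pick the fields we keep).
        row = (event["account_id"], event["symbol"], float(event["quantity"]), event["as_of"])
        with conn, conn.cursor() as cur:
            # Assumes a unique constraint on (account_id, symbol, as_of).
            cur.execute(
                """
                INSERT INTO portfolio_snapshots (account_id, symbol, quantity, as_of)
                VALUES (%s, %s, %s, %s)
                ON CONFLICT (account_id, symbol, as_of) DO UPDATE SET quantity = EXCLUDED.quantity
                """,
                row,
            )
        consumer.commit(msg)
finally:
    consumer.close()
    conn.close()
```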

2. System Integration (Core Requirement)

You will be responsible for building and maintaining integration pipelines across multiple platforms inside Charisma:

CRM & Marketing Automation Integrations

  • Build and maintain integrations between Dynamics 365 CRM and other internal systems (OMS, market data, user data).
  • Sync customer segments, activities, and events between systems.
  • Build two-way integrations with Mautic for campaigns, user tracking, scoring, and automation workflows (see the sketch after this list).
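
As a rough illustration of the Mautic side of these integrations, the sketch below pushes a contact update to Mautic's REST API, assuming basic-auth API access is enabled. The host, credentials, and field values are hypothetical placeholders.

```python
import requests

MAUTIC_BASE = "https://mautic.example.internal"   # hypothetical host
AUTH = ("api_user", "api_password")               # hypothetical credentials

def upsert_contact(email: str, segment: str) -> int:
    """Create or update a Mautic contact and return its id."""
    resp = requests.post(
        f"{MAUTIC_BASE}/api/contacts/new",
        auth=AUTH,
        json={"email": email, "tags": [segment]},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["contact"]["id"]
```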

Workflow Orchestration & Automation

  • Build custom integration flows using n8n to bridge internal/external services (authentication systems, CRM, marketing tools, data APIs).
  • Automate processes combining NiFi + n8n + API gateways (a small example follows this list).
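
A minimal sketch of this kind of glue, assuming an n8n workflow exposed through a Webhook trigger behind an API gateway; the URL, payload, and function name are hypothetical placeholders.

```python
import requests

N8N_WEBHOOK_URL = "https://gateway.example.internal/n8n/webhook/crm-sync"  # hypothetical

def trigger_crm_sync(customer_id: str, segment: str) -> None:
    """Hand a customer-segment change off to an n8n workflow for downstream automation."""
    response = requests.post(
        N8N_WEBHOOK_URL,
        json={"customer_id": customer_id, "segment": segment},
        timeout=10,
    )
    response.raise_for_status()

if __name__ == "__main__":
    trigger_crm_sync("C-12345", "high-value")
```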

Scheduling & Orchestration

  • Develop scheduled and event-driven data pipelines using Apache Airflow for cross-system coordination (see the DAG sketch after this list):
    • Full loads
    • Incremental ingestion
    • Daily financial pipelines
    • Dependencies between systems (CRM → Mautic → Analytics → BI)
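
A minimal Airflow DAG sketch of the dependency chain above (CRM → Mautic → Analytics → BI); the DAG id, schedule, and task bodies are hypothetical placeholders for illustration only.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_crm():
    pass  # e.g. incremental pull from Dynamics 365 CRM

def sync_mautic():
    pass  # e.g. push refreshed segments to Mautic

def build_analytics():
    pass  # e.g. rebuild analytical tables

def refresh_bi():
    pass  # e.g. trigger BI dataset refresh

with DAG(
    dag_id="daily_crm_to_bi",          # hypothetical DAG id
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    crm = PythonOperator(task_id="extract_crm", python_callable=extract_crm)
    mautic = PythonOperator(task_id="sync_mautic", python_callable=sync_mautic)
    analytics = PythonOperator(task_id="build_analytics", python_callable=build_analytics)
    bi = PythonOperator(task_id="refresh_bi", python_callable=refresh_bi)

    # Each step runs only after the previous one succeeds.
    crm >> mautic >> analytics >> bi
```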

Internal System Integrations

Integrate internal services such as:

  • WSO2 Gateway
  • Notification Center
  • Identity/SSO
  • Customer Lifecycle Platform
  • Redis cache layers
  • Elasticsearch search layer
  • Portfolio and trading systems
  • Accounting/reporting services

3. Data Modeling & Storage

  • Model and optimize datasets for SQL Server, MySQL, PostgreSQL, MongoDB, ClickHouse, and Elasticsearch.
  • Build analytical layers for BI dashboards, financial reporting, and operational monitoring.
  • Implement schema design for Vector DBs (FAISS, Milvus) and Graph DBs (Neo4j, JanusGraph) for search/recommender use cases (a FAISS sketch follows this list).
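
As a small illustration of the vector-search use case, the FAISS sketch below indexes item embeddings and retrieves nearest neighbours for a recommender; the dimension and data are made up.

```python
import faiss
import numpy as np

dim = 128                                                    # hypothetical embedding dimension
embeddings = np.random.rand(10_000, dim).astype("float32")   # stand-in for real item embeddings

index = faiss.IndexFlatIP(dim)      # exact inner-product search
faiss.normalize_L2(embeddings)      # normalize so inner product equals cosine similarity
index.add(embeddings)

query = np.random.rand(1, dim).astype("float32")
faiss.normalize_L2(query)
scores, ids = index.search(query, 5)   # top-5 most similar items
print(ids[0], scores[0])
```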

4. Quality, Performance & Governance

  • Ensure data quality, consistency, lineage, deduplication, referential integrity, and reconciliation (a small example follows this list).
  • Optimize SQL queries, indexes, partitions, memory usage, and distributed database performance.
  • Document all pipelines, data flows, business logic, and integration layers.
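
A minimal pandas sketch of two routine checks from this list: deduplicating a feed on its business key and reconciling daily totals against the source system. Column names are hypothetical placeholders.

```python
import pandas as pd

def deduplicate(transactions: pd.DataFrame) -> pd.DataFrame:
    """Keep only the latest version of each transaction, by business key."""
    return (
        transactions.sort_values("ingested_at")
        .drop_duplicates(subset=["transaction_id"], keep="last")
    )

def reconcile_daily_totals(warehouse: pd.DataFrame, source: pd.DataFrame) -> pd.DataFrame:
    """Compare per-day amounts between the warehouse and the source of truth."""
    w = warehouse.groupby("trade_date", as_index=False)["amount"].sum()
    s = source.groupby("trade_date", as_index=False)["amount"].sum()
    merged = w.merge(s, on="trade_date", how="outer", suffixes=("_warehouse", "_source"))
    merged["diff"] = merged["amount_warehouse"].fillna(0) - merged["amount_source"].fillna(0)
    return merged[merged["diff"].abs() > 0]   # rows that need investigation
```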

5. Collaboration

  • Work closely with DataOps to deploy pipelines, tune clusters, monitor performance, and design high-availability environments.
  • Collaborate with BI, Analytics, Backend, CRM, and Product teams to deliver correct and reliable datasets.

Required Skills & Qualifications

  • 2–5+ years of experience as a Data Engineer or similar data-intensive engineering role.
  • Strong Python programming skills (pandas, NumPy, fastavro, dataflow automation, asynchronous processing).
  • Strong SQL experience with SQL Server, MySQL, PostgreSQL.
  • Experience with Kafka, Kafka Connect, and real-time streaming pipelines.
  • Hands-on experience with Apache NiFi for ingestion, routing, transformation, and system integration.
  • Experience integrating systems via REST APIs, SOAP, webhooks, queues, and file-based pipelines.
  • Experience with Dynamics CRM, Mautic, n8n, or similar workflow tools.
  • Experience with Airflow for scheduling, orchestration, and DAG development.
  • Familiarity with MongoDB and ClickHouse for operational/analytical storage.
  • Experience with Elasticsearch for indexing, search, and analytical queries.
  • Understanding of Vector DBs and Graph DBs (or willingness to learn).
  • Experience working in fully on-premises environments (no cloud).

Nice to Have

  • Experience with Flink or distributed compute engines.
  • Familiarity with Redis pipelines, Lua scripts, or caching strategies.
  • Experience with message queues such as RabbitMQ or ActiveMQ.
  • Knowledge of data governance and master data management.
  • Financial market or portfolio analytics background.

Soft Skills

  • Ability to understand complex business rules and turn them into scalable data models.
  • Strong analytical and problem-solving mindset.
  • Excellent communication across multiple teams.
  • Ownership and ability to drive solutions end-to-end.
  • Detail-oriented and structured execution.

What We Offer

  • Work with large-scale, real-time financial data and high-performance architectures.
  • A strong, collaborative Data Platform team.
  • Competitive salary, benefits, and career path.
  • The opportunity to work with advanced technologies: Kafka, NiFi, Elasticsearch, Airflow, Mautic, Dynamics CRM, Redis, Vector/Graph DBs.

 

This posting was found on the Jobvision website; applications for the role are submitted through Jobvision.
