Kafka + Data Pipeline Engineer (Remote / Contract) at Century 21 Accent Homes

Build advanced messaging systems with Kafka & AI tech for a legacy-driven real estate leader since 1973.
Full-time

Legal agreement: Depends on the location of the candidate

Currency exchange and taxes to be paid by: Depends on the location of the candidate

Compensation
USD 2.8k - 4k/month
Non-negotiable
Remote (anywhere)
Posted about 1 year ago

Requirements and responsibilities


🛠️ Responsibilities
* Design and implement Kafka topics for processing communication data from Gmail, LeadSimple, and other sources.
* Build a middleware API to:
  - Trigger on incoming Gmail messages.
  - Enrich messages using LeadSimple’s API.
  - (Optionally) perform Rent Manager lookups for property metadata.
* Filter out spam and irrelevant messages using logic such as:
  - Sender whitelisting.
  - LeadSimple contact validation.
* Maintain a conversation_context table using Tableflow or ksqlDB for enriched thread metadata.
* Ensure messages are properly routed to Kafka (communication-events) as clean, AI-ready JSON (see the sketches after this list).
* Collaborate with internal teams to define:
  - Message schemas.
  - Enrichment fields.
  - Downstream AI use cases.
* Set up lightweight monitoring and logging for errors and failed enrichments.
* Advise on infrastructure best practices (e.g., using Redis for caching, managing Pub/Sub backpressure).

✅ Requirements
* Proven experience with Kafka (Confluent Cloud or self-hosted).
* Hands-on experience with:
  - Kafka Streams or ksqlDB.
  - REST API integrations (especially the Gmail API and/or CRMs like LeadSimple).
* Proficiency in Python, Node.js, or similar backend languages.
* Familiarity with event-driven architecture and streaming design patterns.
* Experience with at least one cloud provider (AWS, GCP, or Azure).
* Solid understanding of:
  - Asynchronous job handling.
  - Retry logic.
  - Webhook workflows.
* Ability to structure clean, enriched JSON events for AI, analytics, and automation.
* Excellent communication skills to explain technical concepts clearly to ops-minded teams.

🎯 Nice-to-Have
* Experience with the OpenAI API, LangChain, or vector databases such as Pinecone or Chroma.
* Experience building agent-style tools (e.g., Slack bots or AI copilots).
* Prior exposure to property management systems or service scheduling tools.
* Experience with Tableflow or other streaming-state tools.
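To make the enrichment flow concrete, here is a minimal Python sketch of the middleware step described above: a handler triggered per incoming Gmail message that applies a sender whitelist and a LeadSimple contact check, then publishes a clean JSON event to the communication-events topic. The LeadSimple endpoint URL, payload field names, environment variables, and whitelist entries are assumptions for illustration only; the confluent-kafka client is one possible choice, and the real schema would be defined with the team.

```python
import json
import os
from typing import Optional

import requests
from confluent_kafka import Producer

# Hypothetical endpoint and settings; adjust to the real environment.
LEADSIMPLE_CONTACT_URL = "https://api.leadsimple.com/v1/contacts"  # assumed endpoint
KAFKA_TOPIC = "communication-events"
SENDER_WHITELIST = {"tenant@example.com", "owner@example.com"}  # placeholder addresses

producer = Producer({"bootstrap.servers": os.environ["KAFKA_BOOTSTRAP_SERVERS"]})


def enrich_with_leadsimple(sender_email: str) -> Optional[dict]:
    """Look up the sender in LeadSimple; endpoint and field names are placeholders."""
    resp = requests.get(
        LEADSIMPLE_CONTACT_URL,
        headers={"Authorization": f"Bearer {os.environ['LEADSIMPLE_API_KEY']}"},
        params={"email": sender_email},
        timeout=10,
    )
    resp.raise_for_status()
    contacts = resp.json().get("data", [])
    return contacts[0] if contacts else None


def handle_gmail_message(message: dict) -> None:
    """Triggered per incoming Gmail message (payload shape is illustrative only)."""
    sender = message["from"]

    # Spam / relevance filter: sender whitelist plus LeadSimple contact validation.
    contact = enrich_with_leadsimple(sender)
    if sender not in SENDER_WHITELIST and contact is None:
        return  # neither whitelisted nor a known LeadSimple contact; drop as irrelevant

    # Clean, AI-ready JSON event for the communication-events topic.
    event = {
        "source": "gmail",
        "thread_id": message["thread_id"],
        "message_id": message["id"],
        "sender": sender,
        "subject": message.get("subject", ""),
        "body_text": message.get("body_text", ""),
        "contact": contact,  # enrichment from LeadSimple (may be None)
        "received_at": message["received_at"],
    }
    producer.produce(KAFKA_TOPIC, key=event["thread_id"], value=json.dumps(event))
    producer.flush()
```

Keying events by thread_id keeps all messages from one conversation in the same partition, which simplifies downstream thread-level aggregation.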
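For the conversation_context table, one possible approach is a ksqlDB table aggregated per thread over the communication-events stream, submitted through ksqlDB's REST /ksql endpoint. The stream and column names below are assumptions matching the sketch above, and the ksqlDB host is a placeholder; Tableflow or another streaming-state tool could fill the same role.

```python
import os

import requests

# Hypothetical ksqlDB host and column names; the real schema would be agreed with the team.
KSQLDB_URL = os.environ.get("KSQLDB_URL", "http://localhost:8088")

KSQL_STATEMENTS = """
CREATE STREAM communication_events (
    thread_id   VARCHAR KEY,
    sender      VARCHAR,
    subject     VARCHAR,
    received_at VARCHAR
) WITH (KAFKA_TOPIC = 'communication-events', VALUE_FORMAT = 'JSON');

CREATE TABLE conversation_context AS
    SELECT thread_id,
           LATEST_BY_OFFSET(sender)      AS last_sender,
           LATEST_BY_OFFSET(subject)     AS last_subject,
           LATEST_BY_OFFSET(received_at) AS last_received_at,
           COUNT(*)                      AS message_count
    FROM communication_events
    GROUP BY thread_id
    EMIT CHANGES;
"""

# Submit both statements to ksqlDB's /ksql endpoint.
resp = requests.post(
    f"{KSQLDB_URL}/ksql",
    json={"ksql": KSQL_STATEMENTS, "streamsProperties": {}},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```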