• Specialist Solutions Engineer (Remote, USA)

    Confluent, Saint Paul, MN 55145

    Job #2810810710

  • With Confluent, organizations can harness the full power of continuously flowing data to innovate and win in the modern digital world. We have a purpose that drives us to do better every day: we're creating an entirely new category within data infrastructure - data streaming. This technology will allow every organization to create experiences and use the power of data in ways that profoundly impact the way we all live.

    One Confluent. One team. One Data Streaming Platform.

    Data Connects Us.

    About the Role:

    We're seeking an innovative and experienced technologist to join our Global Specialist Solutions Engineering team, a group dedicated to solving complex, real-time data challenges and optimizing Data Lake, Data Warehouse, and Data Analytics architectures.

    As a trusted technical advisor, you will collaborate closely with product management and engineering, serving as a key advocate for Confluent's platform. This role demands expertise in Kafka, distributed systems, and pre-sales engineering, and involves engaging with cross-functional teams to drive product success and customer outcomes.

    Ideal candidates may have backgrounds in Technical Account Management, Sales Engineering, or Professional Services.

    What You Will Do:

    • Enable Customers with Real-Time Architectures

    • Understand customer challenges with traditional Data Warehouses, Data Lakes, and Batch Analytics workflows, and guide them toward real-time, distributed architectures using Kafka, Flink, Kafka Streams, and modern ETL/ELT frameworks.

    • Help customers optimize their data platforms by focusing on early-stage data enrichment, filtering, and processing for better performance and cost efficiency.

    • Provide Technical Expertise

    • Assist customers and sales teams in designing, deploying, and optimizing real-time data streaming platforms, integrating Kafka with distributed processing, and ensuring alignment with business goals.

    • Architect solutions to unify operational and analytical workloads, enabling a data mesh or streaming-first architecture.

    • Collaborate Across Teams

    • Partner with sales, product management, marketing, and engineering to translate customer feedback into continuous improvements for Confluent's platform.

    • Work cross-functionally to refine product messaging and technical positioning, ensuring clarity and alignment with customer needs.

    • Drive Thought Leadership

    • Lead technical workshops, webinars, and sessions to showcase the advanced capabilities of Kafka, Kafka Connect, Flink, and other data streaming tools.

    • Advocate for best practices in real-time architectures through case studies, training sessions, and field engagement with both technical and business stakeholders.

    What You Will Bring:

    • Core Skills & Expertise

    • Proven experience architecting and implementing enterprise-scale, real-time data streaming solutions with Kafka and related ecosystems.

    • Deep knowledge of Data Warehouses, Data Lakes, Lakehouses, and Distributed Systems, including CDC Connectors, ETL tools, and RDBMS platforms.

    • Expertise in distributed data processing, data governance, and real-time analytics.

    • Hands-on programming experience with Python, Java, or similar, especially for building streaming and ETL pipelines.

    • Customer & Business Impact

    • A demonstrated ability to become a trusted advisor to Chief Data Officers, Data Engineers, Data Scientists, Data Analysts, and Architects, driving customer success and platform adoption.

    • Strong pre-sales engineering skills, including technical presentations, proof-of-concept development, and hands-on solution design.

    • Collaboration & Innovation

    • Proven ability to work with cross-functional teams, including product management, engineering, and marketing, to drive shared outcomes.

    • A passion for learning and adapting to emerging technologies, particularly within the Kafka ecosystem.

    What Gives You an Edge:

    • Extensive experience with modern data architectures, including Data Warehouses, Lakehouses, data formats (e.g., Avro, Parquet), and cloud-native platforms (AWS, GCP, Azure).

    • Expertise in integrating Kafka-based solutions with cloud services and enterprise data ecosystems.

    • Demonstrated success designing and implementing distributed streaming systems for large-scale enterprises, including Fortune 1000 companies.

    Come As You Are

    At Confluent, equality is a core tenet of our culture. We are committed to building an inclusive global team that represents a variety of backgrounds, perspectives, beliefs, and experiences. The more diverse we are, the richer our community and the broader our impact. Employment decisions are made on the basis of job-related criteria without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, or any other classification protected by applicable law.

    At Confluent, we are committed to providing competitive pay and benefits that are in line with industry standards. We analyze and carefully consider several factors when determining compensation, including work history, education, professional experience, and location. This position has an annual estimated salary of $136,700 - $160,650 and a competitive equity package, and is also eligible for additional commission and/or bonus pay. The actual pay may vary depending on your skills, qualifications, experience, and work location. In addition, Confluent offers a wide range of employee benefits. To learn more about our benefits, click HERE.

    Click HERE to review our Candidate Privacy Notice, which describes how and when Confluent, Inc. and its group companies collect, use, and share certain personal information of California job applicants and prospective employees.

    #LI-Remote