Job
- Level
- Senior
- Job Field
- BI, Data
- Employment
- Full-time
- Contract Type
- Permanent employment
- Location
- Vienna
- Work Model
- Full Remote, Hybrid
AI Summary
In this job, you will develop robust data pipelines and handle the integration of diverse data sources. Your main task is optimizing the data infrastructure to meet the demands of our AI products.
Job Technologies
Your Role in the Team
- Are you a seasoned Data professional, passionate about building and maintaining robust data pipelines and infrastructure?
- Are you ready to join us in our pivot towards an AI-future?
- In this role, you will be instrumental in ensuring the smooth flow of data, enabling critical data-driven insights, and supporting the increasing demands of our newest AI product.
- You will collaborate closely with the Data Engineering team, Data Analytics, Product teams and other Engineering teams to deliver high-quality data solutions.
- If you love solving complex data challenges with tech and thrive in fast-paced, cross-functional environments, this role is for you!
What We Expect from You
Qualifications
- Python Pro. You have solid expertise with Python and follow industry best practices.
- Integration Expert. You will create new integrations between diverse data sources and our data warehouse.
- Cost Conscious. You are adept at keeping data infrastructure costs under control and never lose sight of our spending.
- Proactive Problem Solver. You proactively identify and address areas for improvement in our data infrastructure.
- Strategic Contributor. You will contribute to strategic decisions regarding data infrastructure and architecture, making sure our setup is efficient and future-proof.
Experience
- 5+ Years' Experience. You've worked in the data space as an Engineer and have a proven track record of building complex data pipelines in AI-empowered environments.
- Technical Acumen. You are proficient in SQL and have experience with data warehousing tools like BigQuery.
- Data Stack. You have hands-on experience with tools like Airflow and dbt, which are essential for our data operations.
- Data Pipeline Ownership. You will maintain and develop data pipelines, ensuring their reliability and efficiency. Less experienced team members can count on your guidance and mentorship.
- Cloud Credentials. You know your way around cloud platforms (ideally GCP) and have worked with Pub/Sub before. Experience with IaC and Kubernetes is a plus!
Benefits
Health, Fitness & Fun
Work-Life-Integration
Job Locations
Topics You Will Work On
This Is Your Employer

MeisterLabs Software GmbH
Vienna
Meister develops intelligent, intuitive web apps that help teams of all sizes and industries turn ideas into reality. Our flagship products, MindMeister and MeisterTask, support a complete creative workflow, from collaborative brainstorming to agile task management.
Description
- Company Size
- 1-49 Employees
- Founded
- 2006
- Languages
- English
- Company Type
- Startup
- Work Model
- Full Remote, Hybrid, Onsite
- Industry
- Internet, IT, Telecom
Dev Reviews
by devworkplaces.com
- Overall (2 reviews): 4.4
- Engineering: 4.4
- Working Conditions: 4.6
- Career Growth: 4.1
- Culture: 4.5