At Accion Labs, we believe in Driving Outcomes Through Actions. We build the solutions of tomorrow by combining innovation, engineering excellence, and a strong product‑ownership mindset. Our teams collaborate globally with talented engineers, architects, data specialists, and product owners to deliver impactful, human‑centered technology that accelerates digital transformation.
Our mission is to improve lives through purpose‑driven technology innovation, helping organizations modernize through AI‑powered engineering, cloud-native solutions, automation, and advanced analytics. Join us and be part of a forward‑thinking environment where innovation, learning, and engineering excellence are core to everything we do.
About the Role
We are seeking a highly skilled Senior Cognitive Data Engineer to join our innovative team. This role represents the next evolution of data engineering, combining traditional expertise with cutting‑edge AI capabilities to revolutionize data workflows and intelligence extraction.
You will design, develop, and maintain scalable, AI‑driven data pipelines and infrastructure—leveraging advanced AI tools and semantic data understanding to deliver autonomous, intelligent data solutions. Reporting to the Head of Data Engineering, you'll play a pivotal role in evolving our engineering standards, automation capabilities, and AI‑powered data systems, with a focus on the unique data challenges within the hospitality industry.
Core Focus Areas
- Produce high‑quality, maintainable code using traditional engineering practices augmented by AI tools such as Cursor IDE, GitHub Copilot, and Claude API.
- Leverage AI technologies to optimize:
  - Code generation and automated refactoring
  - Test case generation and validation
  - Performance tuning for data pipelines
  - Operational excellence through intelligent automation
- Apply advanced prompt‑engineering techniques to obtain reliable, repeatable results from AI tools.
- Build reusable, automated frameworks for autonomous data quality assessment, pipeline optimization, and self‑healing systems.
- Develop semantic data layers and intelligent agents to enhance data‑context understanding and automation.
- Utilize Cursor IDE for AI‑assisted pair programming, including intelligent code completion, automated refactoring, and AI‑powered debugging.
- Integrate Claude API for natural‑language data querying, automated documentation, and intelligent transformation suggestions.
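To make the "autonomous data quality assessment" and "self‑healing" ideas above concrete, here is a minimal sketch in plain Python. Every name, rule, and threshold is illustrative; a production framework would run this logic inside PySpark or Glue jobs rather than over Python lists.

```python
# Sketch: automated data-quality check with a simple "self-healing" step.
# All field names and thresholds are hypothetical examples.

NULL_RATE_THRESHOLD = 0.2  # flag the field if more than 20% of values are missing


def assess_quality(records, field):
    """Return the fraction of records where `field` is missing."""
    if not records:
        return 0.0
    missing = sum(1 for r in records if r.get(field) is None)
    return missing / len(records)


def self_heal(records, field, default):
    """Impute a default for missing values instead of failing the pipeline."""
    return [
        {**r, field: r[field] if r.get(field) is not None else default}
        for r in records
    ]


def run_check(records, field, default):
    """Assess quality and heal automatically when the threshold is breached."""
    rate = assess_quality(records, field)
    if rate > NULL_RATE_THRESHOLD:
        records = self_heal(records, field, default)
    return records, rate


# Hypothetical hospitality-style records with a missing room rate.
rows = [{"room_rate": 120.0}, {"room_rate": None}, {"room_rate": 95.5}]
healed, null_rate = run_check(rows, "room_rate", default=0.0)
```

The same pattern generalizes: each quality rule pairs a detection function with a remediation function, so pipelines can recover from common data defects without manual intervention.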
Technical Excellence:
- Design and implement scalable data pipelines using Python, PySpark, and SQL to process large datasets.
- Build and manage data infrastructure using AWS services including EMR, S3, Redshift, Lambda, Glue, and Bedrock.
- Debug and optimize complex data workflows for performance, scalability, and reliability.
- Use Jenkins and Terraform for CI/CD pipelines and infrastructure‑as‑code deployments.
- Collaborate with cross‑functional teams in an Agile environment.
- Clearly communicate technical solutions to both technical and non‑technical stakeholders.
- Stay current on industry trends and contribute to ongoing innovation and hackathon initiatives.
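The performance, scalability, and reliability concerns above can be sketched with a small pure‑Python pattern: process large datasets in bounded chunks and retry transient failures with backoff. In practice this logic would live in PySpark/EMR jobs; the chunk size, retry counts, and transform below are all illustrative.

```python
# Sketch: a reliability pattern for pipeline steps -- bounded chunks
# plus retries with exponential backoff. All parameters are examples.

import time


def chunked(items, size):
    """Yield fixed-size chunks so memory stays bounded on large inputs."""
    for i in range(0, len(items), size):
        yield items[i:i + size]


def with_retries(fn, attempts=3, base_delay=0.0):
    """Retry a flaky step; re-raise only after the final attempt fails."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))


def run_pipeline(rows, transform, chunk_size=1000):
    """Apply `transform` to each row, chunk by chunk, with retries."""
    out = []
    for chunk in chunked(rows, chunk_size):
        out.extend(with_retries(lambda c=chunk: [transform(r) for r in c]))
    return out


doubled = run_pipeline(list(range(5)), lambda x: x * 2, chunk_size=2)
```

Wrapping each chunk in a retry keeps a single transient failure (a throttled API call, a dropped connection) from failing the whole run, which is the same principle behind retry policies in EMR and Glue jobs.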
Required Qualifications:
- 12–15 years of experience in data engineering, including at least 3 years in team leadership.
- Expert‑level proficiency in Python, PySpark, and SQL.
- Deep experience with AWS cloud services (EMR, S3, Redshift, Lambda, Glue).
- Proficiency with Cursor IDE, GitHub Copilot, and Claude API for AI‑assisted development.
- Experience with CI/CD tools (Jenkins) and infrastructure‑as‑code (Terraform).
- Proven track record in implementing test‑automation frameworks and optimizing data workflows.
- Strong problem‑solving abilities, innovation mindset, and clear communication of complex ideas.
- Hands‑on experience working within Agile methodologies.
- Participation in innovation events or hackathons.
Certifications:
- Cloud certifications (e.g., AWS Solutions Architect, AWS Data Analytics, AWS Cloud Data Engineer).
Additional Qualifications:
- Exposure to AI/ML frameworks or modern big‑data ecosystems (e.g., Kafka, Kinesis).
- Experience with hospitality industry data systems (e.g., PMS platforms such as Opera, OnQ, FSPMS).
- Knowledge of advanced AI techniques (graph neural networks, causal AI, quantum‑inspired algorithms).
- Familiarity with data governance, AI ethics, and privacy‑preserving techniques (differential privacy, federated learning).