Sentienz
http://sentienz.com
About
Sentienz is a technology company specializing in next-generation IT engineering solutions. They focus on AI, Data Analytics, Distributed Computing, Product Development, Platform Development, and IoT. The company recently won the Startup Trailblazer category at the LEAP India Startup Summit 2024 for their product Akiro.
Candid answers by the company
Sentienz is a technology company founded in 2016 that specializes in two core areas:
- Product:
  - Akiro - an AI-powered platform for IoT connectivity, focusing on smart meters, EV systems, and utilities
- Technology Services:
  - AI & Data Intelligence solutions
  - Product & Platform Development
  - DevOps & MLOps services
Think of Sentienz as a tech partner that helps businesses:
- Build and scale their technology products
- Implement AI and data analytics solutions
- Modernize their IT operations
They primarily serve industries such as finance, healthcare, telecom, and smart cities, and are headquartered in Bangalore. The company has grown to 11-50 employees and was recently recognized as a Startup Trailblazer at the LEAP India Startup Summit 2024.
Jobs at Sentienz
Key Responsibilities
- Design, develop, and optimize data pipelines using Apache Spark to process large volumes of structured and unstructured data (an illustrative pipeline sketch follows this list).
- Write efficient and maintainable code in Scala and Python for data extraction, transformation, and loading (ETL) operations.
- Collaborate with cross-functional teams to define data engineering solutions to support analytics and machine learning initiatives.
- Implement and maintain data lake and warehouse solutions using cloud platforms (e.g., AWS, GCP, Azure).
- Ensure the performance, scalability, and reliability of data workflows and distributed systems.
- Perform data quality assessments, implement monitoring, and improve data governance practices.
- Assist in migrating and refactoring legacy data systems into modern distributed data processing platforms.
- Provide technical leadership and mentorship to junior engineers and contribute to best practices in coding, testing, and deployment.
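The responsibilities above describe Spark-based ETL work only in broad terms. Purely as a hedged illustration of what such a pipeline can look like, here is a minimal PySpark sketch; the input path, column names (device_id, event_ts, reading), and output location are hypothetical placeholders, not details from the posting.

```python
# Minimal, illustrative PySpark ETL sketch: read raw events, clean and
# aggregate them, and write the result as partitioned Parquet.
# All paths, column names, and table layouts below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("example-etl")  # hypothetical job name
    .getOrCreate()
)

# Extract: load raw, semi-structured event data (placeholder path).
raw = spark.read.json("s3://example-bucket/raw/events/")

# Transform: drop malformed rows, derive a date column, aggregate per device/day.
daily = (
    raw.filter(F.col("device_id").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .groupBy("device_id", "event_date")
       .agg(
           F.count("*").alias("event_count"),
           F.avg("reading").alias("avg_reading"),
       )
)

# Load: write partitioned output that downstream analytics can query.
(
    daily.write
         .mode("overwrite")
         .partitionBy("event_date")
         .parquet("s3://example-bucket/curated/daily_device_metrics/")
)

spark.stop()
```

The same extract-transform-load structure applies whether the job is written in Python or Scala: a read step for raw data, a set of DataFrame transformations, and a write step that produces partitioned output for downstream analytics.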
Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 6+ years of hands-on experience in data engineering, with strong skills in Apache Spark, Scala, and Python.
- Experience with distributed data processing frameworks and real-time data processing.
- Strong experience with big data technologies such as Hadoop, Hive, and Kafka.
- Proficient with relational databases (SQL, PostgreSQL, MySQL) and NoSQL databases (Cassandra, HBase, MongoDB).
- Knowledge of CI/CD pipelines and DevOps practices for deploying data workflows.
- Strong problem-solving skills and experience with optimizing large-scale data systems.
- Excellent communication and collaboration skills.
- Experience with orchestration tools like Airflow (a minimal DAG sketch follows this list).
- Experience with containerization and orchestration (e.g., Docker, Kubernetes).
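Since the qualifications call out Airflow for orchestrating data workflows, here is a minimal, hypothetical Airflow DAG that schedules a daily Spark job such as the ETL sketch above. The DAG id, schedule, and script path are assumptions made for the example, and the schedule argument as written targets Airflow 2.4+.

```python
# Minimal, illustrative Airflow DAG that runs a Spark ETL job once a day.
# The DAG id, schedule, and script path are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_daily_etl",        # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ style schedule argument
    catchup=False,
) as dag:
    run_spark_etl = BashOperator(
        task_id="run_spark_etl",
        # Submit the ETL script sketched earlier (placeholder path).
        bash_command="spark-submit /opt/jobs/example_etl.py",
    )
```

A SparkSubmitOperator or a Kubernetes-based operator would be equally valid choices here; the BashOperator is used only to keep the sketch self-contained.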
Key Responsibilities
- Design, develop, and optimize data pipelines using Apache Spark to process large volumes of structured and unstructured data.
- Write efficient and maintainable code in Scala and Python for data extraction, transformation, and loading (ETL) operations.
- Collaborate with cross-functional teams to define data engineering solutions to support analytics and machine learning initiatives.
- Implement and maintain data lake and warehouse solutions using cloud platforms (e.g., AWS, GCP, Azure).
- Ensure the performance, scalability, and reliability of data workflows and distributed systems.
- Perform data quality assessments, implement monitoring, and improve data governance practices (an illustrative quality check follows this list).
- Assist in migrating and refactoring legacy data systems into modern distributed data processing platforms.
- Provide technical leadership and mentorship to junior engineers and contribute to best practices in coding, testing, and deployment.
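The data-quality bullet above is stated abstractly, so here is a small, hedged PySpark example of the kind of check it might involve: computing a null rate and a duplicate count and failing the run when they breach a threshold. The dataset path, key columns, and thresholds are invented for illustration.

```python
# Illustrative data-quality check in PySpark: compute a null rate and a
# duplicate count, and fail the run if they exceed invented thresholds.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-dq-check").getOrCreate()

# Placeholder dataset; the key columns below are assumptions for the example.
df = spark.read.parquet("s3://example-bucket/curated/daily_device_metrics/")

total = df.count()
null_keys = df.filter(F.col("device_id").isNull()).count()
duplicates = total - df.dropDuplicates(["device_id", "event_date"]).count()

null_rate = null_keys / total if total else 0.0
print(f"rows={total} null_key_rate={null_rate:.4f} duplicates={duplicates}")

# Thresholds invented purely for illustration; real limits would come from
# the team's data-governance standards.
if null_rate > 0.01 or duplicates > 0:
    raise ValueError("Data quality check failed")

spark.stop()
```

In a production setting these metrics would typically feed a monitoring system or a dedicated data-quality framework rather than a print statement.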
Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 8+ years of hands-on experience in data engineering, with strong skills in Apache Spark, Scala, and Python.
- Experience with distributed data processing frameworks and real-time data processing.
- Strong experience with big data technologies such as Hadoop, Hive, and Kafka (a minimal streaming sketch follows this list).
- Proficient with relational databases (SQL, PostgreSQL, MySQL) and NoSQL databases (Cassandra, HBase, MongoDB).
- Knowledge of CI/CD pipelines and DevOps practices for deploying data workflows.
- Strong problem-solving skills and experience with optimizing large-scale data systems.
- Excellent communication and collaboration skills.
- Experience with orchestration tools like Airflow.
- Experience with containerization and orchestration (e.g., Docker, Kubernetes).
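Because this role also asks for Kafka and real-time processing experience, here is a minimal Spark Structured Streaming sketch that consumes a hypothetical Kafka topic and maintains per-device counts. The broker address, topic name, and message schema are assumptions, and running it requires the spark-sql-kafka connector package.

```python
# Illustrative Spark Structured Streaming job: consume a Kafka topic and
# maintain per-device event counts. Broker, topic, and schema are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("example-streaming").getOrCreate()

# Assumed JSON message layout for the example.
schema = StructType([
    StructField("device_id", StringType()),
    StructField("reading", DoubleType()),
])

events = (
    spark.readStream
         .format("kafka")
         .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder broker
         .option("subscribe", "example-events")                # placeholder topic
         .load()
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

# Running count of events per device; the console sink is used only for the sketch.
query = (
    events.groupBy("device_id").count()
          .writeStream
          .outputMode("complete")
          .format("console")
          .start()
)
query.awaitTermination()
```

With watermarking and a windowed aggregation, the same pattern extends to time-based metrics; a real deployment would write to a durable sink rather than the console.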
Similar companies
Fractal Analytics
About the company
Fractal is one of the most prominent players in the Artificial Intelligence space. Fractal's mission is to power every human decision in the enterprise, and it brings AI, engineering, and design to help the world's most admired Fortune 500® companies.
Fractal's products include Qure.ai to assist radiologists in making better diagnostic decisions, Crux Intelligence to help CEOs and senior executives make better tactical and strategic decisions, Theremin.ai to improve investment decisions, Eugenie.ai to find anomalies in high-velocity data, Samya.ai to drive next-generation Enterprise Revenue Growth Management, Senseforth.ai to automate customer interactions at scale to grow the top line and bottom line, and Analytics Vidhya, the largest Analytics and Data Science community, offering industry-focused training programs.
Fractal has more than 3600 employees across 16 global locations, including the United States, UK, Ukraine, India, Singapore, and Australia. Fractal has consistently been rated as one of India's best companies to work for by The Great Place to Work® Institute, featured as a leader in the Customer Analytics Service Providers Wave™ 2021, Computer Vision Consultancies Wave™ 2020, and Specialized Insights Service Providers Wave™ 2020 by Forrester Research, named a leader in the Analytics & AI Services Specialists Peak Matrix 2021 by Everest Group, and recognized as an "Honorable Vendor" in the 2022 Magic Quadrant™ for data & analytics by Gartner. For more information, visit fractal.ai.
Jobs: 4
HyrHub
About the company
Jobs: 23
Starmark Software
About the company
Jobs: 2
Apprication pvt ltd
About the company
Jobs: 10
Oddr Inc
About the company
Oddr is the legal industry's only AI-powered invoice-to-cash platform. Oddr's platform centralizes, streamlines, and accelerates every step of billing and collections, from bill preparation and delivery to collections and reconciliation, enabling new possibilities in analytics, forecasting, and client service that eliminate revenue leakage and increase profitability across the billing and collections lifecycle.
www.oddr.com
Jobs: 7
Shipthis Inc
About the company
At Shipthis, we work to build a better future and make meaningful changes in the freight forwarding industry. Our team members aren't just employees; we are bright, skilled professionals with a single, straightforward goal: to evolve freight forwarders toward digitalized operations, enhance efficiency, and drive lasting change.
As a company, we're just the right size for every person to take initiative and make things happen. Join us in reshaping the future of logistics and be part of a journey where your contributions make a tangible difference.
Jobs: 3
Avalon Solution
About the company
Jobs: 20
Brainayan
About the company
Jobs: 2
Force Identification Private Limited
About the company
Jobs: 1
Conections One Business Services Private Limited
About the company
Jobs: 2