
Greetings!
We are looking for a Data Engineer for one of our premium clients, for their Chennai and Tirunelveli locations.
Required Education/Experience
● Bachelor’s degree in Computer Science or a related field
● 5–7 years of experience in the following:
● Snowflake and Databricks management
● Python and AWS Lambda
● Scala and/or Java
● Data integration services, SQL, and ETL/ELT (an illustrative sketch follows this list)
● Azure or AWS for development and deployment
● Jira or similar tool during SDLC
● Experience managing codebase using Code repository in Git/GitHub or Bitbucket
● Experience working with a data warehouse.
● Familiarity with structured and semi-structured data formats including JSON, Avro, ORC, Parquet, or XML
● Exposure to working in an agile work environment
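As a rough illustration of the Python/AWS Lambda and SQL/ELT items above, here is a minimal sketch of a Lambda handler that copies a newly landed S3 JSON file into a Snowflake landing table. Every name here (stage, table, warehouse, environment variables) is a hypothetical placeholder rather than a detail of this role.

```python
import os
import snowflake.connector  # pip install snowflake-connector-python

def handler(event, context):
    """Illustrative ELT step: land raw JSON from S3 into Snowflake, transform later in SQL."""
    # The S3 event notification carries the bucket/key of the newly created object.
    key = event["Records"][0]["s3"]["object"]["key"]

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",   # hypothetical warehouse
        database="RAW",        # hypothetical database
        schema="LANDING",      # hypothetical schema
    )
    cur = conn.cursor()
    try:
        # ELT style: copy the file as-is from an external stage pointing at the
        # triggering bucket; downstream transformations happen later in SQL.
        # (Identifier quoting/escaping is omitted for brevity.)
        cur.execute(
            f"COPY INTO LANDING.EVENTS_RAW "
            f"FROM @RAW_STAGE/{key} "
            f"FILE_FORMAT = (TYPE = 'JSON')"
        )
    finally:
        cur.close()
        conn.close()
    return {"loaded_key": key}
```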

- Mandatory (Experience 1): Must have a minimum of 4 years of experience in B2B SaaS / Enterprise Tech sales, preferably in ERP, CAD, or Factory Automation tools.
- Mandatory (Experience 2): At least 3 years of experience selling ERP solutions or CAD software.
- Mandatory (Experience 3): Proven track record in offline / field sales and experience in selling complex solutions.
- Mandatory (Tools): Proficient in CRM systems (HubSpot, Zoho, Salesforce, etc.) and lead generation tools (LinkedIn Sales Navigator, Apollo, ZoomInfo, etc.)
- Mandatory (Company): Experience in B2B SaaS Product companies or IT/Tech Services companies.
- Mandatory (Note 1): Total experience should not be more than 7 years.
- Mandatory (Note 2): Overall expected CTC should not be more than ₹16 lakh.
Preferred
- Preferred (Industry Fit): Prior exposure to the Manufacturing sector (Garment / Textile Manufacturing highly preferred).
- Preferred Companies (Competitors): Candidates from Solvei8, Zilingo, Centric Software, Gofrugal, Ginesys, or similar players in ERP/CAD/Manufacturing SaaS.
Job Title: Operations Executive
Location: Kolkata
Job Description:
We are looking for a highly organized and proactive Operations Executive to join our growing team. The ideal candidate will be responsible for managing the backend processes that support smooth day-to-day operations, especially in areas such as travel bookings, billing, chauffeur coordination, and customer support. The role demands a detail-oriented individual with excellent analytical skills and a commitment to accuracy and customer satisfaction.
Key Responsibilities:
1. Operational Support
- Manage and process travel bookings (domestic and international) in coordination with internal teams and third-party vendors.
- Monitor and verify billing records, ensuring all invoices, receipts, and payment details are accurate and aligned with company policies.
- Handle and coordinate chauffeur services, ensuring timely vehicle allocation, schedule adherence, and issue resolution.
2. Customer Interaction
- Respond to customer queries via email and phone in a professional and timely manner.
- Ensure customer satisfaction by providing effective resolutions to complaints, delays, or discrepancies.
- Maintain detailed logs of customer interactions, feedback, and actions taken.
3. Data Management & Reporting
- Maintain and update internal databases, spreadsheets, and booking systems to ensure real-time data accuracy.
- Analyze operational data to identify inefficiencies, cost-saving opportunities, and performance trends.
Required Skills:
- Strong numerical and analytical ability; comfort with handling data and making calculations.
- Proficiency in MS Office Suite (Excel – VLOOKUP, Pivot Tables; Word; PowerPoint).
- Excellent communication (verbal and written) in English; additional language proficiency is a plus.
- Ability to handle pressure and multitask efficiently in a fast-paced environment.
- Strong organizational and time-management skills.
- Positive attitude with a problem-solving mindset and the ability to work independently or in a team.
Experience: 0–2 years; prior experience in travel booking, billing, chauffeur services, or back-office operations is highly preferred.
Working Days: 6 days/week
Timing: 11 AM to 9 PM
Title: Data Platform / Database Architect (Postgres + Kafka) — AI‑Ready Data Infrastructure
Location: Noida (Hybrid). Remote within IST±3 considered for exceptional candidates.
Employment: Full‑time
About Us
We are building a high‑throughput, audit‑friendly data platform that powers a SaaS for financial data automation and reconciliation. The stack blends OLTP (Postgres), streaming (Kafka/Debezium), and OLAP (ClickHouse/Snowflake/BigQuery), with hooks for AI use‑cases (vector search, feature store, RAG).
Role Summary
Own the end‑to‑end design and performance of our data platform—from multi‑tenant Postgres schemas to CDC pipelines and analytics stores—while laying the groundwork for AI‑powered product features.
What You’ll Do
• Design multi‑tenant Postgres schemas (partitioning, indexing, normalization, RLS), and define retention/archival strategies (a minimal schema sketch follows this list).
• Make Postgres fast and reliable: EXPLAIN/ANALYZE, connection pooling, vacuum/bloat control, query/index tuning, replication.
• Build event‑streaming/CDC with Kafka/Debezium (topics, partitions, schema registry), and deliver data to ClickHouse/Snowflake/BigQuery.
• Model analytics layers (star/snowflake), orchestrate jobs (Airflow/Dagster), and implement dbt‑based transformations.
• Establish observability and SLOs for data: query/queue metrics, tracing, alerting, capacity planning.
• Implement data security: encryption, masking, tokenization of PII, IAM boundaries; contribute to PCI‑like audit posture.
• Integrate AI plumbing: vector embeddings (pgvector/Milvus), basic feature‑store patterns (Feast), retrieval pipelines and metadata lineage.
• Collaborate with backend/ML/product to review designs, coach engineers, write docs/runbooks, and lead migrations.
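To make the first bullet above concrete, here is a minimal sketch of a list-partitioned, RLS-protected multi-tenant table applied via psycopg2. Table, column, and policy names are hypothetical, and a real schema would add indexes, retention, and provisioning logic.

```python
import psycopg2  # pip install psycopg2-binary

# Illustrative DDL only: one logical table, list-partitioned by tenant, with
# row-level security keyed off a per-session setting. All names are made up.
DDL = """
CREATE TABLE IF NOT EXISTS invoices (
    tenant_id  UUID           NOT NULL,
    invoice_id BIGINT         GENERATED ALWAYS AS IDENTITY,
    amount     NUMERIC(18, 2) NOT NULL,
    created_at TIMESTAMPTZ    NOT NULL DEFAULT now(),
    PRIMARY KEY (tenant_id, invoice_id)   -- must include the partition key
) PARTITION BY LIST (tenant_id);

-- Example per-tenant partition; in practice created by provisioning code.
CREATE TABLE IF NOT EXISTS invoices_tenant_a
    PARTITION OF invoices
    FOR VALUES IN ('00000000-0000-0000-0000-000000000001');

-- Each application session SETs app.tenant_id; the policy hides other tenants'
-- rows (note: RLS does not apply to the table owner unless forced).
ALTER TABLE invoices ENABLE ROW LEVEL SECURITY;
CREATE POLICY tenant_isolation ON invoices
    USING (tenant_id = current_setting('app.tenant_id')::uuid);
"""

def apply_schema(dsn: str) -> None:
    # psycopg2's connection context manager commits the transaction on success.
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(DDL)

if __name__ == "__main__":
    apply_schema("postgresql://app:app@localhost:5432/platform")  # placeholder DSN
```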
Must‑Have Qualifications
• 6+ years building high‑scale data platforms with deep PostgreSQL experience (partitioning, advanced indexing, query planning, replication/HA).
• Hands‑on with Kafka (or equivalent) and Debezium/CDC patterns; schema registry (Avro/Protobuf) and exactly‑once/at‑least‑once tradeoffs.
• One or more analytics engines at scale: ClickHouse, Snowflake, or BigQuery, plus strong SQL.
• Python for data tooling (pydantic, SQLAlchemy, or similar); orchestration with Airflow or Dagster; transformations with dbt.
• Solid cloud experience (AWS/GCP/Azure)—networking, security groups/IAM, secrets management, cost controls.
• Pragmatic performance engineering mindset; excellent communication and documentation.
Nice‑to‑Have
• Vector/semantic search (pgvector/Milvus/Pinecone), feature store (Feast), or RAG data pipelines (see the pgvector sketch after this list).
• Experience in fintech‑style domains (reconciliation, ledgers, payments) and SOX/PCI‑like controls.
• Infra‑as‑Code (Terraform), containerized services (Docker/K8s), and observability stacks (Prometheus/Grafana/OpenTelemetry).
• Exposure to Go/Java for stream processors/consumers.
• Lakehouse formats (Delta/Iceberg/Hudi).
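For the vector-search nice-to-have, this is the pgvector pattern at its simplest: an embeddings table plus a nearest-neighbour lookup, again via psycopg2. The table name, embedding dimension, and distance operator choice are assumptions for illustration.

```python
import psycopg2  # pip install psycopg2-binary

def nearest_documents(dsn: str, query_embedding: list[float], k: int = 5):
    """Sketch of a pgvector k-NN lookup; every object name here is illustrative."""
    # pgvector accepts vectors as '[x1,x2,...]' literals when passed as text.
    vector_literal = "[" + ",".join(str(x) for x in query_embedding) + "]"
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute("CREATE EXTENSION IF NOT EXISTS vector;")
            cur.execute(
                """
                CREATE TABLE IF NOT EXISTS doc_embeddings (
                    doc_id    BIGINT PRIMARY KEY,
                    embedding VECTOR(1536)  -- dimension depends on the embedding model
                );
                """
            )
            # "<->" is pgvector's L2 distance operator; ordering by it gives k-NN.
            cur.execute(
                """
                SELECT doc_id, embedding <-> %s::vector AS distance
                FROM doc_embeddings
                ORDER BY distance
                LIMIT %s;
                """,
                (vector_literal, k),
            )
            return cur.fetchall()
```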
- Experience building applications using Node.js and frameworks such as Express.
- Thorough understanding of React.js and Node.js, including their core principles.
- Ability to understand business requirements and translate them into technical requirements.
- Familiarity with code versioning tools (such as Git, SVN, and Mercurial).
- Understanding the nature of asynchronous programming and its quirks and workarounds
- Strong experience with MongoDB, Postgres
- Highly proficient with Vue.js framework and its core principles such as components, reactivity, and the virtual DOM
- Familiarity with the Vue.js ecosystem, including Vue CLI, Vuex, Vue Router
- Good understanding of HTML5 and CSS3, and Sass
- Understanding of server-side rendering and its benefits and use cases
About the Role:
As a Solution Design team member, you will be focused on designing solutions for some of the most challenging and exciting problems that we are working to solve in the logistics industry. You will be responsible for understanding the customer's business, supply chain challenges and building practical processes or technologically led solutions for these. This role is varied and fast-paced – constantly adapting to the logistics industry's landscape and business needs.
Key responsibilities:
- Study the customer’s supply chain process, build a deeper understanding of industry-specific supply chains and their underlying, fundamental challenges, and build holistic solutions.
- Analyze data to come up with actionable insights.
- Drive implementation of complex engagements.
- Become a knowledge powerhouse within the organization for anything related to logistics.
- Develop holistic business requirements and drive product development while working with the product and technology team.
- Take ownership of complex projects, work with cross-functional teams and drive the projects to completion. Be accountable for the overall technical excellence and quality of the technical output.
- Educate and support customers, both pre-and post-sales, helping them with implementation, testing, integrations, and more.
Preferred qualifications:
- MBA with 4+ years of solid experience in Logistics.
- Good knowledge of logistics, preferably within Steel, Cement, FMCG, Transportation or related industries.
- Good to have experience working with B2B product-based organizations.
- Ability to understand the processes and cost drivers of customers from different industries.
- Strong analytical skills, with the ability to translate data into insights.
- Self-motivated and result-oriented, with a bias for speed and action.
- Good verbal, written, social, presentation and interpersonal skills.
- Ability to thrive in a multi-tasking environment and adjust priorities on-the-fly while still focusing on details and being analytical.
- You are involved in identifying complex, fuzzy problems, breaking them down into smaller parts, and implementing creative, data-driven solutions.
- You are responsible for defining, analysing, and communicating key metrics and business trends to the management teams, and for preparing and delivering business reviews on progress and roadblocks.
- Your role involves partnering with operations/business teams to consult on, develop, and implement KPIs, automated reporting/process solutions, and data infrastructure improvements that meet business needs, and designing, developing, and maintaining scaled, automated, user-friendly systems, reports, and dashboards that support the business.
Your Array (Nice to Haves)
- Exceptional written and verbal communication skills.
- Ability to multitask and work on a range of requirements.
- Experience in analysing very large, complex, multi-dimensional data sets.
- Penchant for business, curiosity about numbers and persistence to work with data to tease out insights.
- Product analytics tools (any of Amplitude, Mixpanel, or MoEngage)
You (Must Haves)
- Bachelor’s in Engineering, Computer Science, Math, Statistics, or a related discipline from a reputed institute, or an MBA from a reputed institute.
- 1+ years of experience in a relevant role.
- Strong problem solving and analytical skills.
- Tableau
- Technical capabilities: BigQuery, SQL, and a scripting language (a brief reporting sketch follows this list)
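As a small illustration of the BigQuery/SQL/scripting combination above, the sketch below runs a weekly aggregation with the google-cloud-bigquery client. The dataset, table, and column names are invented for the example.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

def weekly_shipment_summary(project_id: str) -> list[dict]:
    """Sketch of a reporting query; `analytics.shipments` and its columns are hypothetical."""
    client = bigquery.Client(project=project_id)
    sql = """
        SELECT
            DATE_TRUNC(DATE(created_at), WEEK) AS week,
            COUNT(*) AS shipments,
            AVG(transit_hours) AS avg_transit_hours
        FROM `analytics.shipments`
        GROUP BY week
        ORDER BY week DESC
        LIMIT 12
    """
    # Rows convert cleanly to dicts, ready for a dashboard or a Tableau extract.
    return [dict(row) for row in client.query(sql).result()]
```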
Experience: 3 to 5 Years
Roles and Responsibilities
- Responsible for validating the health of IoT devices.
- Design and implement test scripts to validate IoT device functionality.
- Monitor and analyze large data sets.
- Provide support and respond to product-specific escalations.
- Build systems that detect issues from large data sets.
Essential Skills
- Minimum of 2 years in device testing.
- Automation and scripting experience is a plus.
- Experience with a DBMS such as Postgres, Oracle, or MongoDB.
- Excellent communication skills (written and verbal).
- Ability to work effectively in a team environment.
- Motivated, with a natural curiosity for solving problems.
- Strong sense of ownership.
Qualifications and Education Requirements
BE/BTech in a related field

The Android Developer will design and develop native Android and hybrid applications to meet the functional requirements of products developed by STPL & Trinity Axis Inc. The role involves developing native Android applications, including front-end and back-end code, making use of well-established design patterns and architectures while following agile development processes. It also involves performing unit testing of the code developed, writing unit test cases and test harnesses, performing system integration and bug fixing, developing user documentation, generating relevant reports, and reviewing similar work done by peers.
Measures of Success
- Read and understand high-level product descriptions or requirement documents and propose one or more software designs at the module level that are highly reusable and consistent with the design principles of SaaS platforms.
- Decompose design elements into structured code as per prevailing coding guidelines, prepare and execute unit test cases, and develop test code or test harnesses.
- Trace back through code and design to resolve issues and bugs.
- Document work, software designs, and code; record and produce test reports and release notes.
- Plan, organize, and execute assignments with very little or moderate supervision.
- Code individually and be willing to participate in every aspect of the Android application development process.
- Be responsible for deliveries within the required deadlines; deliveries can be modules, documentation, customer releases, etc.
- Coordinate with the team for timely delivery of work products and ensure the quality of products through reviews.
- Take sub-module-level responsibility in large projects, module (or component) level responsibility in small/medium-sized projects, and complete responsibility in small projects, depending upon the complexity and decomposition.
- Work with the QA team to ensure the validity of the solution.
- Participate in the recruiting of team members as and when deemed necessary.
- Stay contemporary by adopting technological and market evolutions.
- Take on other responsibilities as assigned by management from time to time.
Skills & Experience Required
- Good understanding of software development methodologies, tools, techniques, and practices, and excellent problem-solving skills across applications for mobile, tablet, kiosk, and handheld devices.
- Solid hands-on programming experience with RESTful APIs.
- Strong experience in developing native Android applications using Android Studio / Eclipse.
- Experience with cross-platform tools such as React Native, Ionic, Flutter, or Cordova.
- Experience in developing hybrid applications (iOS & Android).
- Experience with offline storage, threading, and performance tuning.
- Hands-on experience in SQLite database integration.
- Knowledge of the Chart API and other Google APIs, and UI & UX designing.
- Experience with the Android SDK.
- Integration with backend services.
- Experience with IoT-related mobile application projects (monitoring, controlling, OTA updates).
- Worked on embedded-related projects and with embedded teams.
- Worked on BLE connectivity with devices.
- Experience with or knowledge of chat applications.
- Experience in barcode and RFID integration with Android devices.
- Ability to integrate various libraries and SDKs.
- Worked on interactive Android kiosk applications.
Roles and responsibilities
- Develop well-designed, performant, and scalable applications and microservices
- Write reusable, testable, and efficient code aligned with software development best practices
- Integrate data storage solutions including databases, key-value stores, blob stores, etc.
- Build integrations with third-party applications through APIs to ingest and process data
- Develop state-of-the-art analytics tools to support diverse tasks ranging from ad hoc analysis to production-grade pipelines and workflows for customer applications
- Ensure security and data protection aspects within the applications
- Partner with Data Scientists and Analytics Engineers to improve the performance and reliability of advanced algorithms
- Ensure high performance and availability of distributed systems and applications
- Interact directly with client project team members and operational staff to support live customer deployments and production issues
- 4+ years of experience in developing applications using Python and related technologies.
- Familiarity with data ingestion and processing libraries in Python.
- Thorough understanding of REST and gRPC technologies.
- Experience in using ORM (Object Relational Mapper) libraries for data access.
- Experience in developing and hosting APIs and integrating with external applications (a minimal sketch follows this list).
- Experience in building data models and repositories using relational and NoSQL databases.
- Knowledge of JIRA, Bitbucket, and agile methodologies.
- Good to have: knowledge of AWS services such as Lambda, DynamoDB, Kinesis, and others.
- Understanding of fundamental design principles behind a scalable application
- Familiarity with event-driven programming
- Strong unit test and debugging skills
- Affinity for learning and applying new technologies and solving new problems
- Effective organizational skills with strong attention to detail
- Experience working with Docker is a plus
- Comfortable working in a Unix/Linux environment
- Strong communication skills — both written and verbal
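As a rough illustration of the API and ORM items above (the posting does not prescribe a specific framework), here is a minimal REST endpoint built with FastAPI and a SQLAlchemy model. The `Item` model, SQLite database, and endpoints are assumptions for the sketch only.

```python
from fastapi import FastAPI, HTTPException
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

# Illustrative only: a local SQLite file stands in for a real relational database.
engine = create_engine(
    "sqlite:///./demo.db", connect_args={"check_same_thread": False}
)
Base = declarative_base()

class Item(Base):
    __tablename__ = "items"
    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)

Base.metadata.create_all(engine)
app = FastAPI()

@app.post("/items")
def create_item(payload: dict) -> dict:
    # ORM access wrapped in a short-lived session per request.
    with Session(engine) as session:
        item = Item(name=payload["name"])
        session.add(item)
        session.commit()
        return {"id": item.id, "name": item.name}

@app.get("/items/{item_id}")
def read_item(item_id: int) -> dict:
    with Session(engine) as session:
        item = session.get(Item, item_id)
        if item is None:
            raise HTTPException(status_code=404, detail="Item not found")
        return {"id": item.id, "name": item.name}
```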








