
The Sr. Analytics Engineer provides technical expertise in needs identification, data modeling, data movement, transformation mapping (source to target), and automation and testing strategies, translating business needs into technical solutions while adhering to established data guidelines and approaches from a business unit or project perspective.
Understands and leverages best-fit technologies (e.g., traditional star schema structures, cloud, Hadoop, NoSQL, etc.) and approaches to address business and environmental challenges.
Provides data understanding and coordinates data-related activities with other data management groups such as master data management, data governance, and metadata management.
Actively participates with other consultants in problem-solving and approach development.
Responsibilities:
Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow, conceptual, logical, and physical data models based on those needs.
Perform data analysis to validate data models and to confirm the ability to meet business needs.
Assist with and support setting the data architecture direction, ensuring data architecture deliverables are developed, ensuring compliance to standards and guidelines, implementing the data architecture, and supporting technical developers at a project or business unit level.
Coordinate and consult with the Data Architect, project manager, client business staff, client technical staff and project developers in data architecture best practices and anything else that is data related at the project or business unit levels.
Work closely with Business Analysts and Solution Architects to design the data model satisfying the business needs and adhering to Enterprise Architecture.
Coordinate with Data Architects, Program Managers and participate in recurring meetings.
Help and mentor team members to understand the data model and subject areas.
Ensure that the team adheres to best practices and guidelines.
Requirements:
- At least 3 years of strong working knowledge of Spark, Java/Scala/PySpark, Kafka, Git, Unix/Linux, and ETL pipeline design.
- Experience with Spark optimization, tuning, and resource allocation (a brief sketch follows this list).
- Excellent understanding of in-memory distributed computing frameworks such as Spark, including parameter tuning and writing optimized workflow sequences.
- Experience with relational databases (e.g., PostgreSQL, MySQL) and cloud/NoSQL data stores (e.g., Redshift, BigQuery, Cassandra).
- Familiarity with Docker, Kubernetes, Azure Data Lake/Blob Storage, AWS S3, Google Cloud Storage, etc.
- Deep understanding of the various stacks and components of the Big Data ecosystem.
- Hands-on experience with Python is a huge plus.
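A minimal PySpark sketch of the kind of tuning and ETL work described above; the session settings, file paths, and column names are illustrative assumptions, not part of the role description.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical session with a few common tuning knobs (values are placeholders).
spark = (
    SparkSession.builder
    .appName("etl-sketch")
    .config("spark.sql.shuffle.partitions", "200")   # match partition count to data volume
    .config("spark.executor.memory", "4g")           # executor sizing is workload dependent
    .config("spark.sql.adaptive.enabled", "true")    # let AQE coalesce small shuffle partitions
    .getOrCreate()
)

# Illustrative ETL step: read raw events, aggregate, and write partitioned Parquet.
events = spark.read.json("s3a://example-bucket/raw/events/")   # assumed input path
daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))           # assumed column names
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)
daily.write.mode("overwrite").partitionBy("event_date").parquet("s3a://example-bucket/curated/daily/")
```

Adaptive query execution and a sensible shuffle-partition count are typical first levers before touching executor-level settings.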

Similar jobs
As a Senior Backend & Infrastructure Engineer, you will take ownership of backend systems and cloud infrastructure. You’ll work closely with our CTO and cross-functional teams (hardware, AI, frontend) to design scalable, fault-tolerant architectures and ensure reliable deployment pipelines.
What You’ll Do:
- Backend Development: Maintain and evolve our Node.js (TypeScript) and Python backend services with a focus on performance and scalability.
- Cloud Infrastructure: Manage our infrastructure on GCP and Firebase (Auth, Firestore, Storage, Functions, AppEngine, PubSub, Cloud Tasks).
- Database Management: Handle Firestore and other NoSQL DBs. Lead database schema design and migration strategies.
- Pipelines & Automation: Build robust real-time and batch data pipelines (see the sketch after this list). Automate CI/CD and testing for backend and frontend services.
- Monitoring & Uptime: Deploy tools for observability (logging, alerts, debugging). Ensure 99.9% uptime of critical services.
- Dev Environments: Set up and manage developer and staging environments across teams.
- Quality & Security: Drive code reviews, implement backend best practices, and enforce security standards.
- Collaboration: Partner with other engineers (AI, frontend, hardware) to integrate backend capabilities seamlessly into our global system.
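A minimal sketch of the kind of event-driven pipeline step this role involves, using the Python client for Google Cloud Pub/Sub; the project ID, topic name, and payload fields are assumptions for illustration.

```python
import json

from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

# Hypothetical project and topic names.
PROJECT_ID = "example-project"
TOPIC_ID = "sensor-readings"

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)

# Publish one message; Pub/Sub payloads are bytes, so the dict is JSON-encoded first.
payload = {"device_id": "dev-001", "reading": 42.5}
future = publisher.publish(topic_path, json.dumps(payload).encode("utf-8"))
print("Published message id:", future.result())  # blocks until the publish is acknowledged
```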
Must-Haves:
- 5+ years of experience in backend development and cloud infrastructure.
- Strong expertise in Node.js (TypeScript) and/or Python.
- Advanced skills in NoSQL databases (Firestore, MongoDB, DynamoDB...).
- Deep understanding of cloud platforms, preferably GCP and Firebase.
- Hands-on experience with CI/CD, DevOps tools, and automation.
- Solid knowledge of distributed systems and performance tuning.
- Experience setting up and managing development & staging environments.
- Proficiency in English and remote communication.
Good to have:
- Event-driven architecture experience (e.g., Pub/Sub, MQTT).
- Familiarity with observability tools (Prometheus, Grafana, Google Monitoring).
- Previous work on large-scale SaaS products.
- Knowledge of telecommunication protocols (MQTT, WebSockets, SNMP).
- Experience with edge computing on Nvidia Jetson devices.
What We Offer:
- Competitive salary for the Indian market (depending on experience).
- Remote-first culture with async-friendly communication.
- Autonomy and responsibility from day one.
- A modern stack and a fast-moving team working on cutting-edge AI and cloud infrastructure.
- A mission-driven company tackling real-world environmental challenges.
We’re looking for a Backend Developer (Python) with a strong foundation in backend technologies and a deep interest in scalable, low-latency systems.
Key Responsibilities
• Develop, maintain, and optimize backend applications using Python.
• Build and integrate RESTful APIs and microservices.
• Work with relational and NoSQL databases for data storage, retrieval, and optimization.
• Write clean, efficient, and reusable code while following best practices.
• Collaborate with cross-functional teams (frontend, QA, DevOps) to deliver high-quality features.
• Participate in code reviews to maintain high coding standards.
• Troubleshoot, debug, and upgrade existing applications.
• Ensure application security, performance, and scalability.
Required Skills & Qualifications:
• 2–4 years of hands-on experience in Python development.
• Strong command of Python frameworks such as Django, Flask, or FastAPI (a minimal FastAPI sketch follows this list).
• Solid understanding of Object-Oriented Programming (OOP) principles.
• Experience working with databases such as PostgreSQL, MySQL, or MongoDB.
• Proficiency in writing and consuming REST APIs.
• Familiarity with Git and version control workflows.
• Experience with unit testing and frameworks like PyTest or Unittest.
• Knowledge of containerization (Docker) is a plus.
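A minimal sketch of the REST API and unit-testing skills listed above, using FastAPI and its built-in test client; the endpoint, model fields, and test values are illustrative assumptions.

```python
from fastapi import FastAPI
from fastapi.testclient import TestClient
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    name: str
    price: float

# Simple REST endpoint: echo the created item back with an assumed id.
@app.post("/items")
def create_item(item: Item) -> dict:
    return {"id": 1, "name": item.name, "price": item.price}

# A pytest-style unit test exercising the endpoint in-process.
def test_create_item():
    client = TestClient(app)
    response = client.post("/items", json={"name": "notebook", "price": 3.5})
    assert response.status_code == 200
    assert response.json()["name"] == "notebook"
```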
At Shipthis, we work to build a better future and make meaningful changes in the freight forwarding industry. Our team members aren't just employees; we are bright, skilled professionals with a single, straightforward goal: to evolve freight forwarders toward digitalized operations, enhancing efficiency and driving lasting change.
As a company, we're just the right size for every person to take initiative and make things happen. Join us in reshaping the future of logistics and be part of a journey where your contributions make a tangible difference.
Learn more at www.shipthis.co
Job Description
Who are we looking for?
We are seeking a skilled developer experienced in Python with end-to-end project implementation to join our team.
What will you be doing?
- Design and develop backend services for the ERP system using Python and MongoDB
- Collaborate with the frontend development team to integrate the frontend and backend functionalities
- Develop and maintain APIs that are efficient, scalable, and secure
- Write efficient and reusable code that can be easily maintained and updated
- Optimize backend services to improve performance and scalability
- Troubleshoot and resolve backend issues and bugs
Desired qualifications include
- Bachelor’s degree in computer science or a related field
- Proven experience in Python FastAPI with end-to-end (E2E) project implementation
- Proficiency with DevOps and pipelines (GitHub Actions, Google Cloud Platform)
- Knowledge of microservices architecture
- Experience in MongoDB development, including aggregation pipelines (see the sketch after this list)
- Proficiency in RESTful API development
- Experience with the Git version control system
- Strong problem-solving and analytical skills
- Ability to work in a fast-paced environment
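A minimal sketch of the MongoDB aggregation work mentioned above, using PyMongo; the connection string, database, collection, and field names are assumptions for illustration.

```python
from pymongo import MongoClient

# Hypothetical connection and collection names.
client = MongoClient("mongodb://localhost:27017")
shipments = client["erp_demo"]["shipments"]

# Aggregation pipeline: total declared value per destination country for delivered shipments.
pipeline = [
    {"$match": {"status": "delivered"}},
    {"$group": {"_id": "$destination_country", "total_value": {"$sum": "$declared_value"}}},
    {"$sort": {"total_value": -1}},
]
for row in shipments.aggregate(pipeline):
    print(row["_id"], row["total_value"])
```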
We welcome candidates:
- Who are immediate joiners
- Female candidates returning to work after a career break are strongly encouraged to apply
- Whether you're seasoned or just starting out, if you have the skills and passion, we invite you to apply.
We are an equal-opportunity employer and are committed to fostering diversity and inclusivity. We do not discriminate based on race, religion, color, gender, sexual orientation, age, marital status, or disability status.
JOB SYNOPSIS
- Location: Bangalore
- Job Type: Full-time
- Role: Software Developer
- Industry Type: Software Product
- Functional Area: Software Development
- Employment Type: Full-Time, Permanent
Key Skills required (items in bold are mandatory keywords):
1. Proficiency in Python & Django
2. Solid understanding of Python concepts
3. Experience with some form of Machine Learning (ML) (a brief sketch follows this list)
4. Experience using libraries such as NumPy and Pandas
5. Some experience with NLP and Deep Learning using any of PyTorch, TensorFlow, Keras, scikit-learn, or similar
6. Hands on experience with RDBMS such as Postgres or MySQL
7. Experience building REST APIs using DRF or Flask
8. Comfort with Git repositories, branching and deployment using Git
9. Working experience with Docker
10. Experience in deploying Django applications to AWS, DigitalOcean, or Heroku
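A minimal sketch of the Pandas/scikit-learn experience listed above; the dataset is a tiny in-memory example and the feature and label names are assumptions for illustration.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Tiny illustrative dataset (assumed columns, not real data).
df = pd.DataFrame(
    {
        "word_count": [120, 45, 300, 80, 250, 60, 400, 30],
        "has_link": [1, 0, 1, 0, 1, 0, 1, 0],
        "is_spam": [1, 0, 1, 0, 1, 0, 1, 0],
    }
)

X = df[["word_count", "has_link"]]
y = df["is_spam"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Fit a simple classifier and report held-out accuracy.
model = LogisticRegression()
model.fit(X_train, y_train)
print("Test accuracy:", model.score(X_test, y_test))
```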
Job Description: SQL DBA - Trainee Analyst/Engineer
Experience : 0 to 1 Years
No. of Positions: 2
Job Location: Bangalore
Notice Period: Immediate / Max 15 Days
The candidate should have strong SQL knowledge. Key points (a brief sketch follows this section):
- Implement and maintain the database design
- Create database objects (tables, indexes, etc.)
- Write database procedures, functions, and triggers
Good soft skills are a must (written and verbal communication)
Good team player
Ability to work in a 24x7 support model (rotation basis)
Strong fundamentals in Algorithms, OOP, and Data Structures
Should be flexible to support multiple IT platforms
Analytical Thinking
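A minimal sketch of the database-object and trigger work listed above, shown here with Python's built-in sqlite3 module rather than a production SQL Server setup; the table, index, and trigger names are assumptions for illustration.

```python
import sqlite3

# In-memory database purely for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    -- Assumed schema: an orders table with an index and an audit trigger.
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer TEXT NOT NULL,
        amount REAL NOT NULL,
        created_at TEXT DEFAULT CURRENT_TIMESTAMP
    );
    CREATE INDEX idx_orders_customer ON orders(customer);

    CREATE TABLE order_audit (order_id INTEGER, logged_at TEXT);

    -- Trigger: log every insert into the audit table.
    CREATE TRIGGER trg_order_insert AFTER INSERT ON orders
    BEGIN
        INSERT INTO order_audit(order_id, logged_at) VALUES (NEW.id, CURRENT_TIMESTAMP);
    END;
    """
)

conn.execute("INSERT INTO orders (customer, amount) VALUES (?, ?)", ("Acme", 99.5))
print(conn.execute("SELECT order_id FROM order_audit").fetchall())  # -> [(1,)]
```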
Additional Information:
Functional Area: IT Software - DBA, Data Warehousing
Role Category: Admin / Maintenance / Security / Data Warehousing
Role: DBA
Education:
B.Tech/B.E.
Skills
SQL DBA, IMPLEMENTATION, SQL, DBMS, DATA WAREHOUSING
- 3+ years of expertise in Python 3.7 and Django 2 (or Django 3).
- Familiarity with some ORM (Object Relational Mapper) libraries.
- Able to integrate multiple data sources and databases into one system.
- Integration of user-facing elements developed by front-end developers with server-side logic in Django (RESTful APIs).
- Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3.
- Knowledge of user authentication and authorization between multiple systems, servers, and environments.
- Understanding of the differences between multiple delivery platforms, such as mobile vs. desktop, and optimizing output to match the specific platform.
- Able to create database schemas that represent and support business processes.
- Strong unit test and debugging skills.
- Proficient understanding of code versioning tools such as Git.
Desirable / optional:
- Django Channels, WebSockets, Asyncio.
- Experience working with AWS or similar cloud services.
- Experience with containerization technologies such as Docker.
- Understanding of fundamental design principles behind a scalable application, such as caching with Redis (a brief sketch follows this list).
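A minimal sketch of the Redis-backed caching pattern mentioned in the last item, using the redis-py client; the key names, TTL, and the slow function being cached are assumptions for illustration.

```python
import json
import time

import redis  # pip install redis

# Assumes a local Redis server; adjust host/port as needed.
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

def get_report(report_id: int) -> dict:
    """Return a report, serving it from Redis when a cached copy exists."""
    key = f"report:{report_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)

    # Placeholder for an expensive query or computation.
    time.sleep(1)
    report = {"id": report_id, "status": "ok"}

    cache.setex(key, 300, json.dumps(report))  # cache for 5 minutes
    return report

print(get_report(7))  # slow: computed and cached
print(get_report(7))  # fast: served from Redis
```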
Opportunity to work with a Silicon Valley-based security and governance start-up.
About Privacera
Privacera, Inc. is a California-based start-up looking for Senior Software Engineers to work out of our Mumbai/Pune office. Privacera is a cloud-based product that uses cloud-native services in AWS, Azure, and GCP. Privacera is a fast-growing start-up and provides ample opportunity to work on cloud services such as AWS S3, DynamoDB, Kinesis, Redshift, EMR, Azure ADLS, HDInsight, GCP GCS, GCP Pub/Sub, and others.
We are looking for motivated individuals who are keen to work on Cloud or Big Data services or have worked on Cloud and Big Data. If you want to work in a start-up culture and are ready for the challenge, then join us on our exciting journey.
Responsibilities:
- Design, code and debug cloud-native applications.
- Evaluate and identify new technologies for implementation
- Determine operational feasibility by evaluating analysis, problem definition, requirements, solution development and proposed solutions
- Write well designed, testable, efficient code
- Develop software verification plans and quality assurance procedures
- Serve as a subject matter expert
Requirements:
- 5+ years of relevant experience in software development
- Deep understanding of public cloud infrastructure (AWS, Azure, or Google Cloud)
- Experience with large-scale distributed systems
- Ability to troubleshoot distributed systems
- Prior experience with data encryption and TLS/SSL is a strong plus (a brief sketch follows this list)
- Experience with Docker and Kubernetes is a plus
- Deep experience with Java
- Excellent communication skills (writing, conversation, presentation), consensus building, and quick learning
Good to have: experience in production support (Tier 4).
Experience with these technologies is a plus: AWS, Microsoft Azure, Google Cloud, Cloudera, Snowflake, MongoDB, Oracle, Databricks, DataStax, Confluent.
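A minimal sketch of the cloud-service and encryption-at-rest skills listed above, using boto3 to write an S3 object with server-side encryption; the bucket name, object key, and use of SSE-KMS are illustrative assumptions (the role itself centers on Java, so treat this as a compact Python-flavored analogue).

```python
import json

import boto3  # pip install boto3

# Hypothetical bucket and key; credentials come from the standard AWS config/env.
BUCKET = "example-governance-bucket"
KEY = "policies/policy-001.json"

s3 = boto3.client("s3")

# Write the object with server-side encryption (SSE-KMS) so it is encrypted at rest.
s3.put_object(
    Bucket=BUCKET,
    Key=KEY,
    Body=json.dumps({"resource": "sales_db", "access": "read-only"}).encode("utf-8"),
    ServerSideEncryption="aws:kms",
)

# Read it back; decryption is transparent to authorized callers.
obj = s3.get_object(Bucket=BUCKET, Key=KEY)
print(json.loads(obj["Body"].read()))
```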