


JOB DETAILS:
* Job Title: Lead II - Software Engineering - AWS, Apache Spark (PySpark/Scala), Apache Kafka
* Industry: Global digital transformation solutions provider
* Salary: Best in Industry
* Experience: 5-8 years
* Location: Hyderabad
Job Summary
We are seeking a skilled Data Engineer to design, build, and optimize scalable data pipelines and cloud-based data platforms. The role involves working with large-scale batch and real-time data processing systems, collaborating with cross-functional teams, and ensuring data reliability, security, and performance across the data lifecycle.
Key Responsibilities
ETL Pipeline Development & Optimization
- Design, develop, and maintain complex end-to-end ETL pipelines for large-scale data ingestion and processing.
- Optimize data pipelines for performance, scalability, fault tolerance, and reliability.
Big Data Processing
- Develop and optimize batch and real-time data processing solutions using Apache Spark (PySpark/Scala) and Apache Kafka.
- Ensure fault-tolerant, scalable, and high-performance data processing systems.
Cloud Infrastructure Development
- Build and manage scalable, cloud-native data infrastructure on AWS.
- Design resilient and cost-efficient data pipelines adaptable to varying data volume and formats.
Real-Time & Batch Data Integration
- Enable seamless ingestion and processing of real-time streaming and batch data sources (e.g., AWS MSK).
- Ensure consistency, data quality, and a unified view across multiple data sources and formats.
Data Analysis & Insights
- Partner with business teams and data scientists to understand data requirements.
- Perform in-depth data analysis to identify trends, patterns, and anomalies.
- Deliver high-quality datasets and present actionable insights to stakeholders.
CI/CD & Automation
- Implement and maintain CI/CD pipelines using Jenkins or similar tools.
- Automate testing, deployment, and monitoring to ensure smooth production releases.
Data Security & Compliance
- Collaborate with security teams to ensure compliance with organizational and regulatory standards (e.g., GDPR, HIPAA).
- Implement data governance practices ensuring data integrity, security, and traceability.
Troubleshooting & Performance Tuning
- Identify and resolve performance bottlenecks in data pipelines.
- Apply best practices for monitoring, tuning, and optimizing data ingestion and storage.
Collaboration & Cross-Functional Work
- Work closely with engineers, data scientists, product managers, and business stakeholders.
- Participate in agile ceremonies, sprint planning, and architectural discussions.
Skills & Qualifications
Mandatory (Must-Have) Skills
- AWS Expertise
- Hands-on experience with AWS Big Data services such as EMR, Managed Apache Airflow, Glue, S3, DMS, MSK, and EC2.
- Strong understanding of cloud-native data architectures.
- Big Data Technologies
- Proficiency in PySpark or Scala Spark and SQL for large-scale data transformation and analysis.
- Experience with Apache Spark and Apache Kafka in production environments.
- Data Frameworks
- Strong knowledge of Spark DataFrames and Datasets.
- ETL Pipeline Development
- Proven experience in building scalable and reliable ETL pipelines for both batch and real-time data processing.
- Database Modeling & Data Warehousing
- Expertise in designing scalable data models for OLAP and OLTP systems.
- Data Analysis & Insights
- Ability to perform complex data analysis and extract actionable business insights.
- Strong analytical and problem-solving skills with a data-driven mindset.
- CI/CD & Automation
- Basic to intermediate experience with CI/CD pipelines using Jenkins or similar tools.
- Familiarity with automated testing and deployment workflows.
Good-to-Have (Preferred) Skills
- Knowledge of Java for data processing applications.
- Experience with NoSQL databases (e.g., DynamoDB, Cassandra, MongoDB).
- Familiarity with data governance frameworks and compliance tooling.
- Experience with monitoring and observability tools such as AWS CloudWatch, Splunk, or Dynatrace.
- Exposure to cost optimization strategies for large-scale cloud data platforms.
Skills: Big Data, Scala Spark, Apache Spark, ETL pipeline development
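Purely as an illustration of the Spark and Kafka work described in the responsibilities above, here is a minimal PySpark Structured Streaming sketch; the broker address, topic name, schema, and S3 paths are placeholder assumptions, not details from this role.

```python
# Minimal sketch: consume a Kafka (e.g. MSK) topic with Spark Structured
# Streaming and land the parsed records as Parquet on S3. Requires the
# spark-sql-kafka connector on the classpath; broker, topic, schema, and
# paths below are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("occurred_at", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")  # placeholder bootstrap string
    .option("subscribe", "events")
    .option("startingOffsets", "latest")
    .load()
    # Kafka delivers bytes; cast the value and parse the JSON payload into columns.
    .select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "s3://example-bucket/landing/events/")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/events/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```

A production pipeline would add the fault-tolerance, monitoring, and tuning concerns listed above (schema evolution, dead-letter handling, backpressure); this sketch only shows the ingest skeleton.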
******
Notice period: 0 to 15 days only
Job stability is mandatory
Location: Hyderabad
Note: If a candidate is a short-notice joiner, based in Hyderabad, and fits within the approved budget, we will proceed with an offer.
F2F Interview: 14th Feb 2026
3 days in office, Hybrid model.
⚙️ Operations Internship (3 Months)
Location: Remote
Duration: 3 Months
Stipend: ₹5,000 – ₹7,000 per month
Internship Type: Remote
About the Role
We are looking for a proactive and detail-oriented Operations Intern to join our team. This internship offers a great opportunity to gain hands-on experience in business operations, process management, and coordination.
You’ll work closely with the operations team to ensure smooth day-to-day activities and support multiple departments to enhance efficiency and productivity.
Key Responsibilities
- Assist in daily operational activities and process coordination
- Support in data entry, tracking, and report preparation
- Coordinate with internal teams to ensure smooth workflow and communication
- Help in inventory management / vendor coordination / logistics (as applicable)
- Maintain records and documentation accurately
- Identify and suggest process improvements to enhance efficiency
- Support project planning and execution activities
- Perform other administrative and operational tasks as assigned
Required Skills
- Strong organizational and multitasking abilities
- Excellent communication and coordination skills
- Basic knowledge of MS Office (Excel, Word, PowerPoint)
- Problem-solving and analytical mindset
- Attention to detail and a proactive attitude
- Ability to work in a team environment
Eligibility
- Students / Recent graduates in Management, Operations, Business Administration, or related fields
- Available for a 3-month full-time internship
- Immediate joiners preferred
Perks & Benefits
- Internship Certificate
- Letter of Recommendation (based on performance)
- Exposure to real-world business operations
- Opportunity for a full-time role after internship
- 4-8 years of experience in functional testing with a good foundation in technical expertise
- Experience in the Capital Markets domain is a MUST
- Exposure to API testing tools like SoapUI and Postman
- Well versed with SQL
- Hands on experience in debugging issues using Unix commands
- Basic understanding of XML and JSON structures
- Knowledge of FitNesse is good to have
- Should be an early joiner.
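For context on the API-testing expectations above, here is a minimal Python sketch of the kind of check that is often scripted alongside Postman/SoapUI collections; the host, endpoint, and response fields are hypothetical.

```python
# Minimal sketch: call a REST endpoint and validate its JSON payload.
# The base URL, /trades resource, and field names are invented for
# illustration and do not come from the posting.
import requests

BASE_URL = "https://api.example.com"  # placeholder host

def check_trade_endpoint(trade_id: str) -> None:
    resp = requests.get(f"{BASE_URL}/trades/{trade_id}", timeout=10)
    resp.raise_for_status()

    body = resp.json()
    # Assert the fields downstream consumers rely on are present and well-typed.
    assert body["tradeId"] == trade_id
    assert body["status"] in {"NEW", "FILLED", "CANCELLED"}
    assert isinstance(body["quantity"], (int, float))

if __name__ == "__main__":
    check_trade_endpoint("T-12345")
```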
About the Role
We are looking to hire a talented iOS developer to design, build, and maintain the next generation of iOS applications. Your primary focus will be developing high-end iOS applications for the latest Apple mobile devices. Your duties may include collaborating with the product team and Tech team for new application features, identifying and fixing application bottlenecks, maintaining the core code, and updating applications published on the App Store.
To ensure success as an iOS developer, you should have a strong working knowledge of iOS Frameworks, be proficient in Swift, and be able to work as part of a team. Ultimately, an outstanding iOS developer should be able to create functional, attractive applications that perfectly meet the needs of the user.
Job Responsibilities
- Proficient in Swift and Objective-C
- Design and implement new features and improvements, ensuring high performance, responsiveness, and a seamless user experience
- Provide implementation details and documentation for features
- Experience with iOS frameworks such as UIKit, SwiftUI
- Familiarity with RESTful APIs and networking libraries
- Knowledge of version control systems (e.g., Git)
- Identifying potential problems and resolving application bottlenecks.
- Maintain and enhance the existing codebase while adhering to best practices and coding standards.
- Understanding of Apple’s design principles and interface guidelines
- Developing and implementing architecture to support user interface concepts.
- Familiarity with the App Store approval and submission procedure
- Knowledge of Push notifications
- Implement secure coding practices to ensure data protection and user privacy
Experience & Skills Requirements
- Bachelor’s degree in computer science or software engineering.
- Minimum 3 years of experience in iOS development.
- Proficient in Objective-C, SwiftUI, and Cocoa Touch.
- Extensive experience with iOS Frameworks such as Core Data and Core Animation.
- Knowledge of iOS back-end services.
- Knowledge of Apple’s design principles and application interface guidelines.
- Proficient in code versioning tools including Mercurial, Git, and SVN.
- Knowledge of C-based libraries.
- Familiarity with push notifications, APIs, and cloud messaging.
- Experience with continuous integration.
What you’ll be doing
We are much more than our job descriptions, but here is where you will begin:
As a Senior Software Engineer, Data & ML, you'll:
● Architect, design, test, implement, deploy, monitor, and maintain end-to-end backend services. You build it, you own it.
● Work with people from other teams and departments on a day-to-day basis to ensure efficient project execution, with a focus on delivering value to our members.
● Regularly align your team's vision and roadmap with the target architecture within your domain, and ensure the success of complex multi-domain initiatives.
● Integrate already trained ML and GenAI models (preferably GCP) in services.
ROLE:
What you'll need:
Like us, you’ll be deeply committed to delivering impactful outcomes for customers.
What Makes You a Great Fit
● 5 years of proven work experience as a Backend Python Engineer
● Understanding of software engineering fundamentals (OOP, SOLID, etc.)
● Hands-on experience with Python libraries like Pandas, NumPy, Scikit-learn, LangChain/LlamaIndex, etc.
● Experience with machine learning frameworks such as PyTorch, TensorFlow, or Keras, with proficiency in Python
● Hands-on experience with frameworks such as Django, FastAPI, or Flask
● Hands-on experience with MySQL, MongoDB, Redis and BigQuery (or equivalents)
● Extensive experience integrating with or creating REST APIs
● Experience with creating and maintaining CI/CD pipelines; we use GitHub Actions.
● Experience with event-driven architectures using Kafka, RabbitMQ, or equivalents.
● Knowledge about:
o LLMs
o Vector stores/databases
o Prompt engineering
o Embeddings and their implementations
● Some hands-on experience implementing the above ML/AI concepts will be preferred
● Experience with GCP/AWS services.
● You are curious about and motivated by the future trends in data, AI/ML, analytics
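As a rough sketch of the backend stack described above (a FastAPI service exposing an already-trained model), with the model artifact and request fields invented for illustration:

```python
# Minimal sketch: serve an already-trained scikit-learn model behind a
# FastAPI endpoint. The model path and request/response fields are
# illustrative assumptions.
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # hypothetical pre-trained artifact

class PredictRequest(BaseModel):
    features: list[float]

class PredictResponse(BaseModel):
    prediction: float

@app.post("/predict", response_model=PredictResponse)
def predict(req: PredictRequest) -> PredictResponse:
    # scikit-learn expects a 2D array: one row per sample.
    y = model.predict([req.features])
    return PredictResponse(prediction=float(y[0]))
```

Run locally with `uvicorn main:app --reload` (assuming the file is saved as `main.py`); the same pattern extends to wrapping GenAI or embedding calls behind typed request/response models.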
Key Responsibilities:
• Manage the day-to-day accounting operations of the e-commerce business, including accounts payable and receivable, general ledger entries, bank reconciliations, and payroll processing
• Ensure timely and accurate recording of financial transactions in accordance with accounting principles and regulations
• Perform monthly and year-end closing procedures, including preparing financial statements and reports, such as profit and loss statements and balance sheets
• Reconcile financial discrepancies by collecting and analysing account information
• Work closely with the e-commerce team to provide financial insights and analysis for decision-making
• Develop and implement financial controls to ensure compliance with company policies and procedures
• Manage tax filings, including sales tax, use tax, and income tax
• Assist with budgeting and forecasting processes, as well as financial modelling and analysis
• Collaborate with external auditors to provide necessary documentation and support for audits
• Identify and implement cost-saving initiatives.
Qualifications:
• Bachelor's degree in Accounting, Finance, or a related field
• 3+ years of experience in accounting, preferably in the e-commerce industry
• CPA or CMA certification preferred
• Excellent knowledge of accounting principles and regulations
• Experience with accounting software and e-commerce platforms
• Strong analytical and problem-solving skills
• Ability to work collaboratively with cross-functional teams
• Attention to detail and accuracy
• Excellent communication skills
• Proficient in Microsoft Office and accounting software
Reconciliation:
- Bank Reconciliation: Reconcile the bank statements with the company's cash book to ensure that all transactions are recorded accurately.
- Accounts Receivable Reconciliation: Reconcile the accounts receivable ledger with the general ledger to ensure that all accounts are accurate and up to date.
- Accounts Payable Reconciliation: Reconcile the accounts payable ledger with the general ledger to ensure that all vendor accounts are accurate and up to date.
- Intercompany Reconciliation: Reconcile transactions between different entities within the same company to ensure that all intercompany accounts are accurate and up to date.
- Payroll Reconciliation: Reconcile payroll records with bank statements to ensure that all payroll transactions are recorded accurately.
- General Ledger Reconciliation: Reconcile the general ledger with subsidiary ledgers to ensure that all accounts are accurate and up to date.
- E-commerce Platforms Reconciliation: Reconcile all the entries of all B2B and B2C e-commerce platforms.
• Testing and Quality Control of Transformer.
• Servicing of all our products.
• Inward Material Inspection.
• PCB Testing.
• Independent Inspection Handling.
• Modification / new development of procedures and products.
• Knowledge of products in the electrical/electronics field.
• Self-correspondence: email drafting and coordination.
Description
We are a dynamic UK-based technology company that is fundamentally changing the way international logistics operates. We’re searching for a full-stack developer who is excited by the prospect of working at the bleeding edge of high tech in a rapidly growing scale-up. As we establish a global presence, we're expanding our team in India at pace and looking for fantastic engineers to join us. We've recently raised our Series A round from leading US investor Bessemer Venture Partners (https://bvp.com/; LinkedIn, Twilio, Shopify) alongside Episode 1 (https://episode1.com/; Zoopla, Betfair, Shazam) and supply chain-focused fund Dynamo Ventures (https://www.dynamo.vc/; Sennder, Stord).
You will build and maintain our public-facing application. The role will mainly involve developing real-time frontend and backend services, while frequently interacting with a multi-cloud environment. You will build a customer-facing product deeply connected to the ML components of the system, solving new problems regarding how users interact with the results and how they re-train the models through their input.
Specifically, we want someone who can:
- Solve problems from understanding the available data to providing the functional UI requirements
- Architect solid frontend components, including snapshot testing and integration testing
- Develop and maintain APIs and data structures for ML-powered features
- Use cloud-native libraries, in a multi-cloud system
- Work with the DevOps team to build CI/CD pipelines and help manage the infrastructure
Requirements
- Ability to develop user interfaces following HTML usability best practices
- Extensive knowledge of modern front-end development, using React with Hooks and CSS-in-JS
- Capable of building scalable APIs using the most recent Python features
- Extensive experience with SQL and NoSQL databases
- Have worked commercially with containerization tools like Docker, Docker Compose, Kubernetes, etc.
- Experience that would be good to have:
- 2 - 3 years of commercial experience
- Ability to sensibly design features from a UI and UX perspective
- Experience with real-time interfaces and backends
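To illustrate the API and testing expectations above, here is a minimal sketch of an integration test written against a small FastAPI app using its built-in TestClient; the /shipments resource and its payload are invented for this example.

```python
# Minimal sketch: an in-process integration test for a Python API using
# FastAPI's TestClient. The /shipments endpoint and its fields are
# hypothetical and only serve to show the pattern.
from fastapi import FastAPI
from fastapi.testclient import TestClient
from pydantic import BaseModel

app = FastAPI()

class Shipment(BaseModel):
    reference: str
    status: str = "created"

_store: dict[str, Shipment] = {}

@app.post("/shipments", status_code=201)
def create_shipment(shipment: Shipment) -> Shipment:
    # Persist in memory for the example; a real service would hit SQL/NoSQL storage.
    _store[shipment.reference] = shipment
    return shipment

client = TestClient(app)

def test_create_shipment():
    resp = client.post("/shipments", json={"reference": "SHP-001"})
    assert resp.status_code == 201
    assert resp.json()["status"] == "created"
```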
Job title: Azure Architect
Locations: Noida, Pune, Bangalore and Mumbai
Responsibilities:
- Develop and maintain scalable architecture, database design and data pipelines and build out new Data Source integrations to support continuing increases in data volume and complexity
- Design and Develop the Data lake, Data warehouse using Azure Cloud Services
- Assist in designing end to end data and Analytics solution architecture and perform POCs within Azure
- Drive the design, sizing, POC setup, etc. of Azure environments and related services for the use cases and the solutions
- Review the solution requirements and support architecture design to ensure the selection of appropriate technology, efficient use of resources, and integration of multiple systems and technologies.
- Must possess good client-facing experience with the ability to facilitate requirements sessions and lead teams
- Support internal presentations to technical and business teams
- Provide technical guidance, mentoring and code review, design level technical best practices
Experience Needed:
- 12-15 years of industry experience, including at least 3 years in an architect role, along with at least 3 to 4 years of experience designing and building analytics solutions in Azure.
- Experience in architecting data ingestion/integration frameworks capable of processing structured, semi-structured & unstructured data sets in batch & real-time
- Hands-on experience in the design of reporting schemas, data marts and development of reporting solutions
- Develop batch processing, streaming, and integration solutions, and process structured and unstructured data
- Demonstrated experience with ETL development both on-premises and in the cloud using SSIS, Data Factory, Azure Analysis Services, and other ETL technologies
- Experience performing design, development, and deployment using Azure services (Azure Synapse, Data Factory, Azure Data Lake Storage, Databricks, Python, and SSIS)
- Worked with transactional, temporal, time series, and structured and unstructured data.
- Deep understanding of the operational dependencies of applications, networks, systems, security, and policy (both on-premise and in the cloud; VMs, Networking, VPN (Express Route), Active Directory, Storage (Blob, etc.), Windows/Linux).
Mandatory Skills: Azure Synapse, Data Factory, Azure Data Lake Storage, Azure DW, Databricks, Python
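As a hedged sketch of the kind of Databricks/PySpark batch step implied by the mandatory stack above (the storage account, containers, and column names are placeholders, not project details):

```python
# Minimal sketch: read raw CSV from Azure Data Lake Storage, apply a simple
# cleanup transformation, and write partitioned Delta output. Account,
# container, and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.appName("adls-batch-sketch").getOrCreate()

raw = (
    spark.read.option("header", True)
    .csv("abfss://raw@exampleaccount.dfs.core.windows.net/sales/2024/")
)

curated = (
    raw.withColumn("sale_date", to_date(col("sale_date")))
       .withColumn("amount", col("amount").cast("double"))
       .filter(col("amount") > 0)
)

(
    curated.write.format("delta")
    .mode("overwrite")
    .partitionBy("sale_date")
    .save("abfss://curated@exampleaccount.dfs.core.windows.net/sales/")
)
```

On Databricks the Delta format and abfss access are available out of the box; elsewhere the same skeleton would need the delta-lake package and ADLS credentials configured.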
Job Description :
- Should have more than 1 year of hands-on experience
- Good knowledge of Node.js, Express.js, or Restify
- Should have good knowledge of JavaScript.
- Experience implementing frontend using Angular 2+.
- Any database knowledge (SQL, Mongo, or NoSQL)
- Basic understanding of version control using Git
- TypeScript knowledge would be a plus
- Should have good logical skills.










