
About Birla Gold and Precious Metals Private Limited
Job Title: Senior System Administrator
Location: Bangalore
Experience Required: 7+ Years
Work Mode:
- 5 Days Working
- Rotational Shifts
- Hybrid Work after probation
Job Description:
We are seeking a Senior System Administrator with 7+ years of hands-on experience in managing Windows Server environments, virtualization technologies, automation tools, and hybrid infrastructure (on-prem & Azure). The ideal candidate should possess strong problem-solving skills, be proficient in scripting, and have experience in Office 365 and Microsoft Intune administration.
Key Responsibilities:
- Manage and maintain Windows Server environments
- Handle virtualization platforms such as Citrix, Nutanix, VMware, Hyper-V
- Implement and maintain automation using tools like Ansible, Terraform, PowerShell
- Work with Infrastructure as Code (IaC) platforms and DevOps frameworks
- Support and manage Office 365 and Microsoft Intune
- Monitor and support Data Center Operations (DCO) and NOC
- Ensure security and compliance across systems
- Provide scripting and troubleshooting support for infrastructure automation
- Collaborate with teams for CI/CD pipeline integration
- Handle monitoring, backup, and disaster recovery processes
- Work effectively in a hybrid environment (on-prem and Azure)
Skills Required:
- Office 365 Administration
- Microsoft Intune Administration
- Security & Compliance
- Automation & Infrastructure as Code (IaC)
- Tools: PowerShell, Terraform, Ansible
- CI/CD and DevOps framework exposure
- Monitoring & Backup
- Data Center Operations (DCO) & NOC Support
- Hybrid environment experience (on-prem and Azure)
- Scripting & Troubleshooting
- PowerShell scripting for automation
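The scripting-for-automation skill above typically starts with small health-check scripts run across servers. A minimal Python sketch of the idea (the threshold and paths are hypothetical, not from the posting; production sites would use PowerShell or a monitoring tool):

```python
import shutil

def check_disk_usage(path=".", warn_pct=80):
    """Return (used_pct, alert) for a filesystem path.

    warn_pct is a hypothetical threshold; real alerting policies vary per site.
    """
    usage = shutil.disk_usage(path)
    used_pct = round(usage.used / usage.total * 100, 1)
    return used_pct, used_pct >= warn_pct

def disks_needing_attention(paths, warn_pct=80):
    """Scan several mount points and report only those over the threshold."""
    report = {}
    for p in paths:
        pct, alert = check_disk_usage(p, warn_pct)
        if alert:
            report[p] = pct
    return report
```

The same pattern (measure, compare to threshold, report) generalizes to service status, certificate expiry, and backup-age checks.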
Publicis Sapient Overview:
As Senior Associate L1 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will apply a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and independently drive design discussions to ensure the overall health of the solution.
Job Summary:
As Senior Associate L1 in Data Engineering, you will create technical designs and implement components for data engineering solutions, applying a deep understanding of data integration and big data design principles to build custom or packaged solutions. You will independently drive design discussions to ensure the overall health of the solution.
The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python; experience in data ingestion, integration, wrangling, computation, and analytics pipelines; and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is preferable.
Role & Responsibilities:
Job Title: Senior Associate L1 – Data Engineering
Your role is focused on the design, development, and delivery of solutions involving:
• Data Ingestion, Integration and Transformation
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time
• Build functionality for data analytics, search and aggregation
Experience Guidelines:
Mandatory Experience and Competencies:
1. Overall 3.5+ years of IT experience, with 1.5+ years in data-related technologies
2. Minimum 1.5 years of experience in Big Data technologies
3. Hands-on experience with the Hadoop stack: HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required to build end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.
4. Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferable
5. Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, and GCP BigQuery
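The end-to-end pipeline work described above (ingest, wrangle, compute) can be illustrated with a pure-Python sketch; the sample data is invented, and in practice the same three stages would be expressed in Spark, Flink, or similar engines:

```python
import csv
import io
from collections import defaultdict

# Hypothetical raw input, standing in for a file landed by Sqoop/NiFi
RAW = """order_id,country,amount
1,IN,120.50
2,US,80.00
3,IN,40.25
"""

def ingest(text):
    """Parse raw CSV text into dict records (the ingestion stage)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(records):
    """Cast types and drop malformed rows (the wrangling stage)."""
    out = []
    for r in records:
        try:
            out.append({"country": r["country"], "amount": float(r["amount"])})
        except (KeyError, ValueError):
            continue  # skip rows that fail validation
    return out

def aggregate(records):
    """Total amount per country (the computation stage)."""
    totals = defaultdict(float)
    for r in records:
        totals[r["country"]] += r["amount"]
    return dict(totals)
```

Each stage takes and returns plain records, which is the same dataflow shape a distributed engine parallelizes.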
Preferred Experience and Knowledge (Good to Have):
1. Good hands-on knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres)
2. Knowledge of data governance processes (security, lineage, catalog) and tools such as Collibra and Alation
3. Knowledge of distributed messaging frameworks such as ActiveMQ, RabbitMQ, or Solace; search and indexing; and microservices architectures
4. Performance tuning and optimization of data pipelines
5. CI/CD: infrastructure provisioning on cloud, automated build and deployment pipelines, code quality
6. Working knowledge of data platform services on at least one cloud platform, IAM, and data security
7. Cloud data specialty and other related Big Data technology certifications
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes
Sr. Data Engineer
Company Profile:
Bigfoot Retail Solutions [Shiprocket] is a logistics platform which connects Indian eCommerce SMBs with logistics players to enable end-to-end solutions.
Our innovative, data-backed platform drives logistics efficiency, helps reduce cost, increases sales throughput by reducing RTO, and improves post-order customer engagement and experience.
Our vision is to power all logistics for the direct commerce market in India, including first mile, linehaul, last mile, warehousing, cross-border, and O2O.
Position: Sr. Data Engineer
Team : Business Intelligence
Location: New Delhi
Job Description:
We are looking for a savvy Data Engineer to join our growing team of analytics experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives.
Key Responsibilities:
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centres and AWS regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
Qualifications for Data Engineer
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
We are looking for a candidate with 5+ years of experience in a Data Engineer role who has attained a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
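The stream-processing systems listed above (Storm, Spark Streaming) all center on windowed aggregation over an unbounded event stream. A toy Python sketch of a tumbling count window, with an invented event shape, shows the core idea that those engines harden with watermarks, state stores, and fault tolerance:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Count events per (window_start, key) over a tumbling window.

    events: iterable of (epoch_seconds, key) pairs, e.g. page-view events.
    Each event falls into exactly one fixed-size, non-overlapping window.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Floor the timestamp to the start of its window
        window_start = (ts // window_secs) * window_secs
        counts[(window_start, key)] += 1
    return dict(counts)
```

Sliding and session windows follow the same structure with a different window-assignment rule.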


- Proven experience as a Full Stack Developer or similar role
- Experience developing desktop and mobile applications
- Knowledge of multiple front-end languages and libraries (e.g. HTML/ CSS, JavaScript, XML, jQuery)
- Minimum 1 year of experience in Ruby on Rails (RoR) or Angular.
- Excellent communication and teamwork skills
- Great attention to detail
- An analytical mind


We are looking for a Senior Python Developer to join our engineering team and help us develop and maintain various software products. Your primary focus will be the development of all server-side logic, ensuring high performance and responsiveness to requests from the front-end.
Python Developer responsibilities include designing products, writing and testing code, debugging programs, and integrating applications with web services.
Skills And Qualifications
• Expert in Python, with knowledge of multithreading and multitasking frameworks
• Expert in PySpark and the Spark framework
• Expert in Docker and container environments
• Strong unit testing and debugging skills
• Knowledge of MySQL, Hive, Hadoop, and NoSQL databases
• Good understanding of event-driven programming in Python
• Hands-on experience with Kubernetes and container management using Kubernetes on Azure Cloud
• Hands-on experience with Azure Cloud load balancers and VPC networking
• Good understanding of Azure Cloud services
• Knowledge of user authentication and authorization between multiple systems, servers, and environments, e.g. Single Sign-On (SSO)
• Proficient understanding of code versioning tools such as Git
• Understanding of accessibility and security compliance
• Implement security and data protection solutions
• Coordinate with internal teams to understand user requirements and provide technical solutions
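The event-driven programming requirement above usually means asyncio-style code, where handlers yield control while waiting on I/O. A minimal sketch (handler names and delays are illustrative; real handlers would await database or HTTP calls instead of sleeping):

```python
import asyncio

async def handle_request(name, delay):
    """Simulate a non-blocking request handler: the sleep stands in
    for awaited I/O such as a database query or HTTP call."""
    await asyncio.sleep(delay)
    return f"{name}:done"

async def serve(requests):
    """Run all handlers concurrently on the event loop; gather
    preserves the order of the inputs in its results."""
    return await asyncio.gather(
        *(handle_request(n, d) for n, d in requests)
    )

results = asyncio.run(serve([("a", 0.01), ("b", 0.0)]))
```

Because the handlers overlap on one event loop, total latency approaches the slowest handler rather than the sum of all of them.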



You have:
- 4–15 years of relevant experience in full-stack development at a reputed B2B SaaS company, preferably a contract lifecycle management company (e.g. PandaDoc, Ironclad, DocuSign, Zoho, CloudCherry, SpotDraft, etc.)
- Experience in the following tech stack: JavaScript, TypeScript, Angular, Python, Django, REST API, MySQL, Google Cloud
- Solid understanding of RDBMS concepts and experience managing one database (MySQL, PostgreSQL, etc.)
- Solid understanding of Google Cloud Platform (or AWS)
- Strong experience and/or expertise in building customized text editors
- Strong experience with Python data analysis libraries
- Experience and/or expertise in building No-Code platforms
- Experience leading a small team of engineers in an agile/scrum setup
Preferred additional qualifications:
- Experience in building integrations with third party tools.
- Automated testing and data quality assurance experience
- Experience in data analytics
- Experience working with data scientists to build and/or deploy ML models to solve specific problems
- Experience in working with Microsoft Cloud Storage Partner Program (CSPP)
- Experience in a startup environment
- Experience in Blockchain and Web3 technologies

Almost a decade old, it is a venture committed to bringing together a varied range of traditional crafts and techniques of dyeing, weaving, printing, and hand embroidery. The founders have dedicated their lives to promoting Indian block prints and to providing employment and hand-embroidery training to underprivileged women so that they can be empowered.
What you will do:
- Developing analytical solutions to solve problems: identifying internal and external data, using analytical techniques (including statistical techniques) to analyse the problem, and distilling information into actionable insights
- Participating in key decision-making forums and communicating key insights in an effective and influential manner
- Partnering with key stakeholders to prioritize the analytics roadmap, demonstrating a sense of urgency in identifying and acting on opportunities, and driving transparency on the work roadmap
- Monitoring and measuring launched initiatives (A/B testing) and feeding learnings back into the development process
- Managing multiple project priorities, deadlines, and deliverables, including rapid-response as well as strategic priorities
- Identifying appropriate data sources and building data assets to enrich data and expand analytics capabilities
- Converting frequently asked questions into reports/diagnostic tools
What you need to have:
- Graduate/ Post-Graduate Degree in Engineering, Mathematics, Statistics
- 8+ years of well-rounded analytics experience, preferably in internet B2C retail
- Strong exposure to data, analytical framework, hypothesis-based problem solving and scaling analytics
- Ability to work independently and drive your own projects
- Strong problem-solving skills
- Ability to translate business problems into analytics
- Expertise in data wrangling using SQL; exposure to Python is a plus
- Hands-on experience using Excel and PowerPoint
- Excellent communication skills
- Entrepreneurial mind-set, strong interpersonal skills
- Strong collaboration skills - inter and intra team
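The SQL data-wrangling skill called out above can be sketched self-contained with Python's built-in sqlite3 module; the table and columns are made up for illustration:

```python
import sqlite3

# In-memory database standing in for a real analytics warehouse
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (user_id INT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, 100.0), (1, 50.0), (2, 75.0)],
)

# Typical wrangling query: per-user spend, highest spenders first
rows = conn.execute(
    "SELECT user_id, SUM(amount) AS total "
    "FROM orders GROUP BY user_id ORDER BY total DESC"
).fetchall()
```

The GROUP BY / aggregate / ORDER BY shape here is the bread and butter of the hypothesis-driven analysis the role describes.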

Get To Know Us First :
Our founding team consists of people from ISB, NSIT, IITs, and NITs with very strong entrepreneurial backgrounds and F&B experience. LimeTray counts very successful internet entrepreneurs among its angel investors, along with Matrix Partners and JSW Ventures. Our story starts with the realization that anyone who tries to walk a mile in the shoes of a restaurant, even a branded one, finds those shoes ripped all too soon. So we at LimeTray decided to help restaurants on their roller-coaster rides by building products that help them ride better. We help restaurants get online, help them engage with their customers, and help them optimize their operations through a suite of products. In a span of 5 years, we have built successful relationships with some of the top brands in India, including Burger King, Krispy Kreme, Mad Over Donuts, Bercos, and over 2,500 others!
What Are We Looking For:
We are looking for a software engineer who can work with our client vertical to engage consumers through our products. The ideal candidate is an expert in designing reliable services with well-defined APIs that fit into LimeTray's service-oriented architecture, and has experience designing, developing, implementing, and scaling web applications for our rapidly growing technology team.
While every product line has its own tech stack, it is expected that the person is comfortable working across all of them as and when needed. Some of the technologies/frameworks we work on: microservices architecture, Java, Spring, Hibernate, Node.js, MySQL, MongoDB, Angular, React, Kubernetes, AWS, Python.
How Will Your Day Look Like:
Develop interactive, user-friendly applications using the latest open-source and proprietary frameworks.
Contribute reusable components to the LimeTray UI development community.
Optimize applications for maximum speed and scalability.
Develop responsive UIs leveraging open-source, state-of-the-art frameworks.
Help the team leverage and contribute to open source technologies.
Collaborate with other team members and stakeholders.
Contribute to code review and mentoring initiatives.
Make an impact on a global scale.
You Are Good At :
Bachelor’s/Master’s degree in computer science.
3+ years of work experience.
Experience building JavaScript web applications.
Solid fundamentals in data structures and algorithms.
Good debugging and problem-solving capability.
Proficiency in front-end technologies: object-oriented JavaScript, React, Redux, NodeJS, ECMAScript 6+.
At least one full-lifecycle web application built with React + Redux.
Webpack, JSON-based web-services consumption.
AngularJS 1.x and J2EE knowledge is a plus.
Understanding of how to engineer RESTful services.
Microservices and knowledge of SOA architecture.
Deep knowledge of web technologies such as HTML5, CSS (SCSS, LESS), and JSON.
What You’ll Love About Us :
Startup Culture: You will work in an environment that values creative problem solving, encourages open communication and worships flat hierarchy.
Rest And Relaxation: 2 work-from-home days every month, 25 paid holidays
Health Benefits: a health insurance policy governed by Religare Health Group covering all our employees
The candidate will be deployed in a financial captive organization in Pune (Kharadi).
Below are the job details:
Experience: 10 to 18 years
Mandatory skills:
- Data migration
- Data flow
The ideal candidate for this role will have the below experience and qualifications:
- Experience building a range of services on a cloud service provider (ideally GCP)
- Hands-on design and development on Google Cloud Platform (GCP) across a wide range of GCP services, including hands-on experience with GCP storage and database technologies
- Hands-on experience architecting, designing, or implementing solutions on GCP, K8s, and other Google technologies; security and compliance, e.g. IAM and cloud compliance/auditing/monitoring tools
- Desired skills within the GCP stack: Cloud Run, GKE, serverless, Cloud Functions, Vision API, DLP, Dataflow, Data Fusion
- Prior experience migrating on-prem applications to cloud environments. Knowledge and hands-on experience of Stackdriver, Pub/Sub, VPCs, subnets, route tables, load balancers, and firewalls, both on-premise and in GCP.
- Integrate, configure, deploy, and manage centrally provided common cloud services (e.g. IAM, networking, logging, operating systems, containers).
- Manage SDN in GCP. Knowledge and experience of DevOps technologies around continuous integration and delivery in GCP using Jenkins.
- Hands-on experience of Terraform, Kubernetes, Docker, and Stackdriver
- Programming experience in one or more of the following languages: Python, Ruby, Java, JavaScript, Go, Groovy, Scala
- Knowledge or experience of DevOps tooling such as Jenkins, Git, Ansible, Splunk, Jira or Confluence, AppD, Docker, Kubernetes
- Act as a consultant and subject matter expert for internal teams to resolve technical deployment obstacles and improve the product's vision. Ensure compliance with centrally defined security standards.
- Financial experience is preferred
- Ability to learn new technologies and rapidly prototype newer concepts
- Top-down thinker, excellent communicator, and great problem solver
Exp:- 10 to 18 years
Location:- Pune
The candidate must have experience with the following:
- GCP Data Platform
- Data processing: Dataflow, Dataprep, Data Fusion
- Data storage: BigQuery, Cloud SQL
- Pub/Sub, GCS buckets
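The Pub/Sub item above refers to Google Cloud Pub/Sub; its publish/subscribe pattern can be sketched with plain Python, with no GCP client library involved (this toy broker is purely illustrative; the real service adds durable storage, acknowledgements, and push/pull delivery):

```python
from collections import defaultdict

class InMemoryPubSub:
    """Toy broker illustrating the publish/subscribe pattern:
    publishers and subscribers are decoupled through named topics."""

    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to receive every message on a topic."""
        self._subs[topic].append(callback)

    def publish(self, topic, message):
        """Deliver a message to all current subscribers of the topic."""
        for cb in self._subs[topic]:
            cb(message)

received = []
bus = InMemoryPubSub()
bus.subscribe("orders", received.append)
bus.publish("orders", {"id": 1})
```

The decoupling shown here is why Pub/Sub sits between data producers and Dataflow pipelines in typical GCP architectures.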



