
1. Core Responsibilities
· Lead the design and delivery of data engineering solutions
· Maintain the integrity of both the design and the data held within the architecture
· Champion and educate people in the development and use of data engineering best practices
· Support the Head of Data Engineering and lead by example
· Contribute to the development of database management services and associated processes relating to the delivery of data solutions
· Provide requirements analysis, documentation, development, delivery and maintenance of data platforms.
· Develop database requirements in a structured and logical manner, ensuring delivery is aligned with business prioritisation and best practice
· Design and deliver performance enhancements, application migration processes and version upgrades across a pipeline of BI environments.
· Provide support for the scoping and delivery of BI capability to internal users.
· Identify risks and issues and escalate to Line / Project manager.
· Work with clients, existing asset owners and their service providers, and non-BI development staff to clarify and deliver work-stream objectives within timescales that meet overall project expectations.
· Develop and maintain documentation in support of all BI processes.
· Proactively identify cost-justifiable improvements to data manipulation processes.
· Research and promote relevant BI tools and processes that contribute to increased efficiency and capability in support of corporate objectives.
· Promote a culture that embraces change, continuous improvement and a ‘can do’ attitude.
· Demonstrate enthusiasm and self-motivation at all times.
· Establish effective working relationships with other internal teams to drive improved efficiency and effective processes.
· Be a champion for high-quality data and the use of strategic data repositories, the associated relational model, and the Data Warehouse for optimising the delivery of accurate, consistent and reliable business intelligence
· Ensure that you fully understand and comply with the organisation’s Risk Management Policies as they relate to your area of responsibility, and demonstrate in your day-to-day work that you put customers at the heart of everything you do.
· Ensure that you fully understand and comply with the organisation’s Data Governance Policies as they relate to your area of responsibility, and demonstrate in your day-to-day work that you treat data as an important corporate asset which must be protected and managed.
· Maintain the company’s compliance standards and ensure timely completion of all mandatory on-line training modules and attestations.
2. Experience Requirements
· 5 years' Data Engineering / ETL development experience is essential
· 5 years' data design experience in an MI / BI / Analytics environment (Kimball, lakehouse, data lake) is essential
· 5 years' experience of working in a structured Change Management project lifecycle is essential
· Experience of working in a financial services environment is desirable
· Experience of dealing with senior management within a large organisation is desirable
· 5 years' experience of developing on large, complex projects and programmes is desirable
· Experience mentoring other members of the team on best practice and internal standards is essential
· Experience with cloud data platforms (Microsoft Azure) is desirable
3. Knowledge Requirements
· A strong knowledge of business intelligence solutions and an ability to translate this into data solutions for the broader business is essential
· Strong demonstrable knowledge of data warehouse methodologies
· Robust understanding of high-level business processes is essential
· Understanding of data migration, including reconciliation, data cleanse and cutover is desirable

Similar jobs
KDK Software, India’s leading tax-tech company, is hiring for its SheConnect initiative—a women-only, remote-first team. This flexible sales role is designed for women seeking true work-life balance. Presently, we’re welcoming 10+ experienced female professionals to take on the role of Business Development Officer and grow with us—on your schedule, from wherever you are.
Apply Now: bit.ly/KDKHR
Requirements:
- 5 years' experience (including 3 years in sales)
- Excellent Negotiation Skills.
- Excellent English communication and understanding.
Perks and benefits:
- Work from home
- 7-hour work shift (including breaks)
- Travel Reimbursement
Remote Locations: Ahmedabad, Ambala, Aurangabad, Bangalore, Chandigarh, Chennai, Coimbatore, Delhi, Faridabad, Ghaziabad, Gurgaon, Hyderabad, Indore, Jaipur, Jodhpur, Kanpur, Kochi, Kolkata, Lucknow, Ludhiana, Mumbai, Muzaffarpur, Nagpur, Noida, Patna, Pune, Rajkot, Ranchi, Surat, Vadodara
- Drive the execution of all product lifecycle processes for the growth of the Factory Acquisition product, including product research, market research, competitive analysis, planning, positioning, roadmap development, requirements development, and product launch
- Identify, plan and execute side product hacks to speed up Factory Acquisitions
- Assess current competitor offerings, seeking opportunities for differentiation
- Analyze consumer needs, current market trends, and potential partnerships from an ROI and build vs. buy perspective
- Create product strategy documents that describe business cases, high-level use cases, technical requirements, revenue, and ROI
- Translate product strategy into detailed requirements for prototype construction and final product development by engineering teams
- Build product flows for seamless factory onboarding at scale
- Collaborate closely with engineering, production, marketing, and sales teams on the development, QA, and release of products and balance of resources to ensure success for the entire organization
- Drive the Factory CAC Metric
- Implement and maintain production timelines across multiple departments
- Appraise new product ideas and strategize appropriate to-market plans
As a Kafka Administrator at Cargill you will work across the full set of data platform technologies, spanning on-prem and SaaS solutions, that empower highly performant, modern, data-centric solutions. Your work will play a critical role in enabling analytical insights and process efficiencies for Cargill’s diverse and complex business environments. You will work in a small team that shares your passion for building, configuring, and supporting platforms while sharing, learning and growing together.
- Develop and recommend improvements to standard and moderately complex application support processes and procedures.
- Review, analyze and prioritize incoming incident tickets and user requests.
- Perform programming, configuration, testing and deployment of fixes or updates for application version releases.
- Implement security processes to protect data integrity and ensure regulatory compliance.
- Keep an open channel of communication with users and respond to standard and moderately complex application support requests and needs.
MINIMUM QUALIFICATIONS
- Minimum of 2-4 years of experience
- Knowledge of Kafka cluster management, alerting/monitoring, and performance tuning (a brief sketch of a routine cluster-management task follows this list)
- Full-ecosystem Kafka administration (Kafka brokers, ZooKeeper, kafka-rest, Kafka Connect)
- Experience implementing Kerberos security
- Preferred:
- Experience in Linux system administration
- Authentication plugin experience such as basic, SSL, and Kerberos
- Production incident support including root cause analysis
- AWS EC2
- Terraform
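
For illustration only, a minimal sketch of a routine cluster-management task (creating and listing topics), assuming the open-source kafka-python client and an unauthenticated broker at localhost:9092; Cargill's actual tooling, endpoints and Kerberos/SSL configuration are not specified in this posting.

    # Minimal sketch: create and list topics with kafka-python.
    # Assumptions: kafka-python installed, broker at localhost:9092, no authentication;
    # a real cluster would add SSL/Kerberos settings to the client configuration.
    from kafka.admin import KafkaAdminClient, NewTopic

    admin = KafkaAdminClient(bootstrap_servers="localhost:9092", client_id="admin-sketch")

    # Create a topic with explicit partition and replication settings.
    admin.create_topics([NewTopic(name="example-events", num_partitions=3, replication_factor=1)])

    # List existing topics as a quick inventory check.
    print(admin.list_topics())

    admin.close()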

Designation: Principal Data Engineer
Experience: Experienced
Position Type: Full Time Position
Location: Hyderabad
Office Timings: 9 AM to 6 PM
Compensation: As per industry standards
About Monarch:
At Monarch, we’re leading the digital transformation of farming. Monarch Tractor augments both muscle and mind with fully loaded hardware, software, and service machinery that will spur future generations of farming technologies. With our farmer-first mentality, we are building a smart tractor that will enhance (not replace) the existing farm ecosystem, alleviate labor availability and cost issues, and provide an avenue for competitive organic and beyond farming by providing mechanical solutions to replace harmful chemical solutions. Despite all the cutting-edge technology we will incorporate, our tractor will still plow, till, and haul better than any other tractor in its class. We have all the necessary ingredients to develop, build and scale the Monarch Tractor and digitally transform farming around the world.
Description:
Monarch Tractor would like to invite an experienced Python data engineer to lead our internal data engineering team in India. This is a unique opportunity to work on computer vision AI data pipelines for electric tractors. You will be dealing with data from a farm environment such as videos, images, tractor logs, GPS coordinates and map polygons. You will be responsible for collecting data for research and development. For example, this includes setting up ETL data pipelines to extract data from tractors, loading this data into the cloud and recording AI training results (a brief sketch of such a pipeline follows the requirement lists below).
This role includes, but is not limited to, the following tasks:
● Lead data engineering team
● Own and contribute to more than 50% of the data engineering code base
● Scope out new project requirements
● Cost out data pipeline solutions
● Create data engineering tooling
● Design custom data structures for efficient processing of data
Data engineering skills we are looking for:
● Able to work with large amounts of text log data, image data, and video data
● Fluently use AWS cloud solutions like S3, Lambda, and EC2
● Able to work with data from Robot Operating System
Required Experience:
● 3 to 5 years of experience using Python
● 3 to 5 years of experience using PostgreSQL
● 3 to 5 years of experience using AWS EC2, S3, Lambda
● 3 to 5 years of experience using Ubuntu OS or WSL
Good to have experience:
● Ray
● Robot Operating System
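
For illustration only, a minimal sketch of the kind of ETL step described above: uploading one tractor log file to S3 and recording a metadata row in PostgreSQL so downstream training jobs can find it. The bucket, table and connection details are hypothetical, and AWS credentials are assumed to come from the environment; this is not Monarch's actual pipeline.

    # Minimal sketch. Assumptions: boto3 and psycopg2 installed, AWS credentials in the
    # environment, a hypothetical bucket "tractor-raw-data" and table "tractor_logs".
    import boto3
    import psycopg2

    def upload_log(local_path: str, tractor_id: str) -> str:
        """Extract-and-load step: push one tractor log file to S3 and return its key."""
        key = f"logs/{tractor_id}/{local_path.split('/')[-1]}"
        boto3.client("s3").upload_file(local_path, "tractor-raw-data", key)
        return key

    def record_metadata(conn, tractor_id: str, s3_key: str) -> None:
        """Record where the raw file landed so later AI training runs can reference it."""
        with conn.cursor() as cur:
            cur.execute(
                "INSERT INTO tractor_logs (tractor_id, s3_key) VALUES (%s, %s)",
                (tractor_id, s3_key),
            )
        conn.commit()

    if __name__ == "__main__":
        key = upload_log("/tmp/drive_2024_01_01.log", "tractor-42")
        conn = psycopg2.connect(dbname="telemetry", user="etl", host="localhost")
        record_metadata(conn, "tractor-42", key)
        conn.close()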
What you will get:
At Monarch Tractor, you’ll play a key role on a capable, dedicated, high-performing team of rock stars. Our compensation package includes a competitive salary, excellent health, dental and vision benefits, and company equity commensurate with the role you’ll play in our success.


What are we looking for:
● Good problem-solving skills.
● Strong knowledge of CS fundamentals and data structures.
● Experience working in software development with one or more of the following programming languages: Java, Go, Scala, Python, C/C++.
● Strong understanding of relational and non-relational databases (MySQL, PostgreSQL, MongoDB, Cassandra).
● Experience in working with distributed caching (Memcached, Redis, or comparable technology).
● Experience in working with distributed messaging technologies like RabbitMQ, Kafka, etc.
● Ability to design and implement low latency RESTful services.
● Experience with microservices and web application/services development.

Key Responsibilities:
1. Work closely with the client and team to understand requirements, then design, analyse and implement code changes
2. Help the team and client resolve issues
3. Handle day-to-day activities such as team management and daily meetings
Technical Experience:
- .NET development work experience
- Proficiency in .NET development with .NET Core
- Strong object-oriented programming (OOP) design skills, knowledge of SOLID principles, and proficiency in software design patterns
- Experience with databases: MS SQL Server and NoSQL
- Solid understanding of microservices
- Good understanding of Azure, Docker and Kubernetes
- Experience with version control systems (GitHub and Bitbucket)
Professional Attributes:
1. Good communication skills
2. A good team player
3. Able to work independently and under pressure
4. Strong logical and analytical thinking

* Optimize components for maximum performance across multiple devices and browsers
* Write performant REST APIs for both internal and external consumption (a brief sketch follows this list)
* Build microservices and their deployment process
* Work with problems of scale, leveraging technologies that are distributed in nature.
* Perform code reviews
- Excellent analytical and problem-solving skills
- Exposure to Python and backend development
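
For illustration only, a minimal sketch of a small REST service of the kind described above, written with FastAPI as one possible Python stack; the actual framework, resource names and fields are not specified in this posting and are assumed here.

    # Minimal sketch. Assumptions: fastapi and uvicorn installed; the "items" resource
    # and its fields are hypothetical, and an in-memory dict stands in for a database.
    from fastapi import FastAPI, HTTPException
    from pydantic import BaseModel

    app = FastAPI()

    class Item(BaseModel):
        name: str
        price: float

    items: dict[int, Item] = {}

    @app.post("/items/{item_id}")
    def create_item(item_id: int, item: Item) -> Item:
        items[item_id] = item
        return item

    @app.get("/items/{item_id}")
    def read_item(item_id: int) -> Item:
        if item_id not in items:
            raise HTTPException(status_code=404, detail="Item not found")
        return items[item_id]

    # Run with: uvicorn main:app --workers 4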



Required skills
- Around 6 to 8.5 years of experience overall, with around 4+ years in the AI / Machine Learning space
- Extensive experience in designing large-scale machine learning solutions for ML use cases, handling large-scale deployments and establishing continuous automated improvement / retraining frameworks
- Strong experience in Python and Java is required
- Hands-on experience with Scikit-learn, Pandas and NLTK (a brief sketch follows these lists)
- Experience in handling time-series data and associated techniques such as Prophet and LSTM
- Experience with regression, clustering and classification algorithms
- Extensive experience in building traditional machine learning models (SVM, XGBoost, decision trees) and deep neural network models (RNN, feedforward) is required
- Experience with AutoML tools such as TPOT or others
- Must have strong hands-on experience with deep learning frameworks such as Keras, TensorFlow or PyTorch
- Knowledge of capsule networks, reinforcement learning or SageMaker is desirable
- Understanding of the financial domain is desirable
Responsibilities
- Design and implement solutions for ML use cases
- Productionize systems and maintain them
- Lead and implement data acquisition processes for ML work
- Learn new methods and models quickly and apply them to solving use cases
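
For illustration only, a minimal sketch of a classification workflow using Scikit-learn and Pandas as named above; the data is synthetic and gradient boosting stands in for whichever model family a real use case would call for.

    # Minimal sketch. Assumptions: scikit-learn and pandas installed; data is synthetic.
    import pandas as pd
    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for a real feature table.
    X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
    df = pd.DataFrame(X, columns=[f"f{i}" for i in range(10)])

    X_train, X_test, y_train, y_test = train_test_split(df, y, test_size=0.2, random_state=0)

    model = GradientBoostingClassifier(random_state=0)
    model.fit(X_train, y_train)

    print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))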

