Responsibilities:
* 3+ years of data engineering experience designing, developing, delivering and maintaining data infrastructure.
* SQL specialist with strong knowledge of and seasoned experience writing SQL queries.
* Languages: Python
* Good communicator, shows initiative, works well with stakeholders.
* Experience working closely with Data Analysts, providing the data they need and guiding them through issues.
* Solid ETL experience with Hadoop/Hive/PySpark/Presto/SparkSQL.
* Solid communication and articulation skills
* Able to handle stakeholders independently, with minimal intervention from the reporting manager.
* Develop strategies to solve problems in logical yet creative ways.
* Create custom reports and presentations accompanied by strong data visualization and storytelling
We would be excited if you have:
* Excellent communication and interpersonal skills
* Ability to meet deadlines and manage project delivery
* Excellent report-writing and presentation skills
* Critical thinking and problem-solving capabilities
About Indium Software
Your Day-to-Day
- Derive insights and drive major strategic projects to improve business metrics; take responsibility for cost efficiency and revenue management across the country
- Perform market research and post-mortem analyses of competitor expansion and market penetration patterns
- Provide in-depth business analysis and data insights for internal stakeholders to help improve the business; derive and launch projects to close the gaps between targeted and projected business metrics
- Responsible for optimizing Carsome’s C2B and B2C customer acquisition and Dealer retention funnel. Work closely with Marketing and Tech teams to create, produce and implement creative digital marketing campaigns and drive CRM initiatives and strategies
- Analyse revenue flows and process large datasets to gather process insights and propose process improvement ideas for Carsome across SE Asia
- Lead commercial projects & process mapping, from conceptualization to completion, to build or re-engineer business models, tools and processes.
- Experience analysing unit economics, COGS and P&L is preferred, but not mandatory
- Use Business Intelligence and Data Science tools to answer the appropriate business problems using SQL, Tableau or Python.
- Coordinate with HQ Data Insights Team and manage internal stakeholders across departments to ensure the smooth delivery of strategic projects
- Work across different departments/functions (BI, DE, tech, pricing, finance, operations, marketing, CS, CX) on high-impact projects and support business expansion initiatives
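The acquisition-funnel analysis described above could start with something as simple as stage-to-stage conversion rates. The sketch below is illustrative only; the stage names and counts are hypothetical, not Carsome's actual funnel:

```python
# Illustrative sketch: per-step conversion rates for a customer
# acquisition funnel. Stage names and counts are hypothetical.
def funnel_conversion(stages):
    """Given ordered (stage_name, count) pairs, return per-step conversion rates."""
    rates = {}
    for (prev_name, prev_n), (name, n) in zip(stages, stages[1:]):
        rates[f"{prev_name}->{name}"] = round(n / prev_n, 3) if prev_n else 0.0
    return rates

funnel = [("visits", 10000), ("signups", 2500), ("inspections", 800), ("sales", 200)]
print(funnel_conversion(funnel))
# → {'visits->signups': 0.25, 'signups->inspections': 0.32, 'inspections->sales': 0.25}
```

In practice the counts would come from SQL against the warehouse and the rates would feed a Tableau dashboard, per the tools listed above.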
Your Know-How
- At least a Bachelor's Degree in Accounting/Finance/Business or the equivalent.
- 3-5 years of experience in strategy / consulting / analytical / project management roles; experience in e-commerce, start-ups or unicorns (CARS24, OLA, SWIGGY, FLIPKART, OYO) or entrepreneurial experience preferred, plus at least 2 years of experience leading a team
- Top-notch academics from a Tier 1 college (IIM / IIT/ NIT)
- Must have SQL/PostgreSQL/Tableau Experience.
- Excellent Market Research, reporting and analytical skills, including carrying out weekly and monthly reporting
- Experience working with a Data / Business Intelligence team
- Analytical mindset with ability to present data in a structured and informative way
- Enjoy a fast-paced environment and can align business objectives with product priorities
- Good to have: financial modelling, developing financial forecasts, and development of a financial / strategic plan or framework
We require someone skilled in Python / C / C++ to work on new products and also support existing AI-based products.
Should be open to learning new frameworks.
Deep-Rooted.Co is on a mission to get fresh, clean, community (local farmer) produce from harvest to your home with a promise of quality first! Our values are rooted in trust, convenience, and dependability, with a bunch of learning & fun thrown in.
Founded out of Bangalore by Arvind, Avinash, Guru and Santosh, we have raised $7.5 million to date in Seed, Series A and debt funding from investors including Accel, Omnivore and Mayfield. Our brand Deep-Rooted.Co, launched in August 2020, was the first of its kind in India's fruits & vegetables (F&V) space. It is present in Bangalore and Hyderabad and on a journey of expansion to newer cities, managed seamlessly through a tech platform designed and built to transform the agri-tech sector.
Deep-Rooted.Co is committed to building a diverse and inclusive workplace and is an equal-opportunity employer.
How is this possible? It's because we work with smart people. We are looking for engineers in Bangalore to work with the Product Leader (Founder) (https://www.linkedin.com/in/gururajsrao/) and CTO (https://www.linkedin.com/in/sriki77/). This is a meaningful project for us, and we are sure you will love it, as it touches everyday life and is fun. This will be a virtual consultation.
We want to start the conversation about the project we have for you, but before that, we want to connect with you to know what’s on your mind. Do drop a note sharing your mobile number and letting us know when we can catch up.
Purpose of the role:
* As a startup we have data distributed across various sources like Excel, Google Sheets, databases etc. As we grow, we need swift decision making based on all the data that exists. You help us bring together all this data and put it in a data model that can be used in business decision making.
* Handle nuances of the Excel and Google Sheets APIs.
* Pull data in and manage its growth, freshness and correctness.
* Transform data into a format that aids easy decision-making for Product, Marketing and Business Heads.
* Understand the business problem, solve it using technology and take it to production - no hand-offs - the full path to production is yours.
Technical expertise:
* Good knowledge of and experience with programming languages - Java, SQL, Python.
* Good knowledge of data warehousing and data architecture.
* Experience with data transformations and ETL.
* Experience with API tools and more closed systems like Excel, Google Sheets etc.
* Experience with the AWS cloud platform and Lambda.
* Experience with distributed data processing tools.
* Experience with container-based deployments on the cloud.
Skills:
Java, SQL, Python, Data Build Tool, Lambda, HTTP, Rest API, Extract Transform Load.
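The data-unification work described above - pulling rows from sources like a Google Sheets export and a database into one model, with freshness checks - can be sketched as below. This is illustrative only: the schema, field names and source labels are assumptions, not the actual Deep-Rooted.Co model:

```python
# Illustrative sketch: normalising rows pulled from different sources
# into one common record shape, plus a basic freshness check.
# Field names and sources are hypothetical.
from datetime import date, datetime

def normalise(row, source):
    """Map a source-specific row dict onto a common schema."""
    if source == "sheets":          # Sheets export: string-typed cells
        return {"sku": row["SKU"].strip(),
                "qty": int(row["Quantity"]),
                "updated": datetime.strptime(row["Updated"], "%Y-%m-%d").date()}
    if source == "db":              # database rows: already typed
        return {"sku": row["sku"], "qty": row["qty"], "updated": row["updated_on"]}
    raise ValueError(f"unknown source: {source}")

def is_fresh(record, today, max_age_days=2):
    """Reject stale records before they enter the decision model."""
    return (today - record["updated"]).days <= max_age_days

sheet_row = {"SKU": " TOM-01 ", "Quantity": "40", "Updated": "2023-05-02"}
rec = normalise(sheet_row, "sheets")
print(rec, is_fresh(rec, today=date(2023, 5, 3)))
```

In production the `sheets` rows would come from the Google Sheets API and the normalised records would land in the warehouse; the per-source mapping is the part that absorbs each API's nuances.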
- Design & Development of the architecture for multi-robot planning.
- Design & development of task allocation algorithms
- Design & development of conflict resolution approaches.
- Design & develop queuing strategies for multi-robot deployments.
- Research & collaboration on approaches to improve task allocation based on historical data
- Design & development of communication architecture for inter-robot and robot-server communications.
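The task-allocation work above can be illustrated with a simple greedy strategy: repeatedly assign the globally cheapest (robot, task) pair. This is a sketch, not the project's actual algorithm; the positions and the Manhattan-distance cost metric are assumptions:

```python
# Illustrative greedy multi-robot task allocation: assign the cheapest
# remaining (robot, task) pair until robots or tasks run out.
def allocate(robots, tasks):
    """robots/tasks: dicts of name -> (x, y) grid position. Returns {robot: task}."""
    pairs = sorted(
        (abs(rx - tx) + abs(ry - ty), r, t)     # Manhattan-distance cost
        for r, (rx, ry) in robots.items()
        for t, (tx, ty) in tasks.items()
    )
    assignment, used_r, used_t = {}, set(), set()
    for cost, r, t in pairs:
        if r not in used_r and t not in used_t:
            assignment[r] = t
            used_r.add(r)
            used_t.add(t)
    return assignment

print(allocate({"r1": (0, 0), "r2": (5, 5)},
               {"pick": (1, 0), "drop": (4, 5)}))
# → {'r1': 'pick', 'r2': 'drop'}
```

Greedy pairing is fast but not always cost-optimal; an optimal assignment would use something like the Hungarian algorithm, and the historical-data research mentioned above could tune the cost function itself.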
Requirements:
- B.Tech, M.Tech or higher qualification in Computer Science Engineering, Information Technology or related fields
- Proficiency with C++/Python programming language
- Experience of working in the robotics field for 3-5 years
- Skilled at general software development, bug analysis, and fixing
- Knowledge of networking/communication concepts
- Strong knowledge of Robot Operating System (ROS)
- Good knowledge of system design
- Excellent problem-solving skills
- Good project management skills
- Excellent verbal communication skills
- Good interpersonal skills
• Involved in entire SDLC lifecycle including analysis, development, fixing and monitoring of issues on the assigned product lines.
• Meets and exceeds standards for the quality and timeliness of the work products.
• Implements, unit tests, debugs and leads integrations of complex code.
• Identify opportunities for further enhancements and refinements to best practices, standards and processes.
• Ensure a robust, securely accessible, highly available and highly scalable product that meets or exceeds customer and end-user expectations.
Experience: 3 – 6 years
Technical Duties & Responsibilities
We are looking for independent contributors with 2-4 years of experience in scalable architecture development, who have a good understanding of microservices-based architecture and a comprehensive awareness of various architectures and their suitability to product requirements:
• Strong knowledge of microservices based architecture model.
• Good knowledge of messaging frameworks like RabbitMQ. We prefer a candidate who has used messaging for a chat-type solution or has developed high-transaction message queues.
• Should have experience with Elasticsearch and Kibana.
• Good understanding of RESTful architecture and database technologies (both SQL and NoSQL).
• Ability to understand and solve performance issues and constraints.
• Should have understanding of scaling applications, and know the bottlenecks involved in resource optimization.
• Proficient in the .NET framework and its languages, with good programming knowledge of JavaScript.
• Proficient in the Microsoft stack: ASP.NET, C#, MVC, web services (development and consumption), API development
• Should have experience in Agile, iterative and Scrum based projects.
• Should have worked in a Test-Driven Development (TDD) environment.
• Good knowledge of object-oriented programming, design principles like SOLID, and Gang of Four design patterns.
- Use data to develop machine learning models that optimize decision making in Credit Risk, Fraud, Marketing, and Operations
- Implement data pipelines, new features, and algorithms that are critical to our production models
- Create scalable strategies to deploy and execute your models
- Write well designed, testable, efficient code
- Identify valuable data sources and automate collection processes.
- Preprocess structured and unstructured data.
- Analyze large amounts of information to discover trends and patterns.
Requirements:
- 1+ years of experience in applied data science or engineering with a focus on machine learning
- Python expertise with good knowledge of machine learning libraries and techniques (e.g. pandas, sklearn, xgboost, lightgbm; logistic regression, random forest classifiers, gradient boosting regressors, etc.)
- Strong quantitative and programming skills with a product-driven sensibility
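As a miniature of the model family named above, here is logistic regression trained with plain gradient descent in pure Python. This is illustrative only - the data is synthetic, and in practice one would reach for sklearn or xgboost as listed:

```python
# Minimal logistic regression via stochastic gradient descent.
# Synthetic 1-D data; purely illustrative of the technique.
import math

def train_logreg(X, y, lr=0.1, epochs=500):
    """Fit weights + bias for binary labels y on feature rows X."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))      # sigmoid
            err = p - yi                        # gradient of the log-loss
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    """Classify by the sign of the linear score."""
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1 if z >= 0 else 0

# Toy separable data: label is 1 when the single feature is large.
X, y = [[0.5], [1.0], [1.5], [3.0], [3.5], [4.0]], [0, 0, 0, 1, 1, 1]
w, b = train_logreg(X, y)
print(predict(w, b, [0.8]), predict(w, b, [3.2]))
```

The same shape - fit on historical outcomes, predict on new cases - underlies the credit-risk and fraud models mentioned above, with feature pipelines replacing the toy inputs.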
Job Title: Senior Cloud Service DBA
Department & Team: Technology
Location: INDIA
Reporting To: Senior DBA – Team Lead
Role Purpose:
This role will be a mix of project, BAU and innovation and will encompass typical database administration duties that you may expect in a large-scale telecommunications environment: performance tuning, replication, database design, backups, high availability, encryption, security, configuration etc.
The Technology function comprises four teams:
1. Network Services Team – responsible for the IP network and its associated components
2. Infrastructure Team – responsible for server and storage systems
3. Database Services Team – responsible for all databases
4. Cloud Architect Team – delivering future strategy and ongoing cloud performance optimisation
The DBA function forms part of the client's Tier 3 support function and works closely with the internal NOC, Service Desk, Infrastructure, IP Networks and Cloud Architect teams, enabling the business to achieve its stated objectives by assisting the other technology teams to reach world-class benchmarks of customer service and support.
To highlight, a key requirement of the role will be involvement in defining our future strategy around database modernisation in the cloud.
Responsibilities:
Operations
· Involved in new solution design discussions, recommending suitable, secure, performance-optimised database offerings based on business requirements
· Ensure all databases in AWS and Azure are configured for performance, scale and high availability where required
· Take responsibility for modernisation of the client's database estate in the cloud, leveraging open-source technologies and cloud-native hosting platforms
· Drive innovation by constantly reviewing the latest public cloud database service releases, such as Babelfish in AWS, to fast-track adoption of native services
· Ensure security considerations are at the forefront when designing and managing database solutions
· Optimise query performance
· Ensure all key databases have deep-insight monitoring enabled for improved fault detection
· Perform regular database maintenance when required and ensure databases are backed up according to agreed RTO / RPO targets
· Plan maintenance work meticulously to minimise/eradicate self-inflicted P1 outages
· Monitor database costs regularly and identify strategies to minimise cost as part of internal FinOps practices
· Provide technical system solutions, determine overall design direction and provide hardware recommendations for complex technical issues
· Provision, deploy and monitor the cloud environment using automation tools like Terraform
Skills & Experience:
Certifications:
· SQL Server Database Administration – REQUIRED
· AWS Certified Solutions Architect Associate – Highly Desirable
· AWS Certified Database Specialty – REQUIRED
· Azure Database Administrator Associate – Highly Desirable
· The ideal candidate has supported traditional server-based relational databases for over 8 years and has transitioned to the AWS and Azure public clouds over the last 5 years
· SQL Server / MSSQL 2008 / 2012 / 2014 / 2016 / 2017 / 2019 (including Always On and Analysis Services)
· Postgres / MySQL as standalone and managed service platforms
· Strong database migration experience (particularly MSSQL to open source, leveraging AWS native platforms including RDS, Athena and Aurora)
· Extensive AWS experience in a commercial environment, architecting database best practices
· Strong experience supporting AWS / Azure based data lake / data warehouse environments; required to support internal BI teams
· Solid experience and understanding of which workloads are best suited to which specific database platforms in AWS and Azure
· Extensive experience and understanding of database security, including appropriate encryption and authentication best practices
· Good working knowledge of Microsoft Power BI
· Good knowledge of Azure cloud database services
· Any working experience with non-relational databases (internally hosted, or managed services such as DynamoDB in AWS) will be favoured
· Good working knowledge of Windows and Linux server operating systems
· Excellent presentation skills to both internal and external audiences
· The ability to share and communicate your specific expertise with the rest of the Technology group
Behavioural Fit:
· Professional appearance and manner
· High personal drive; results-oriented; makes things happen; "can-do" attitude
· Can work and adapt within a highly dynamic and growing environment
· Team player; effective at building close working relationships with others
· Effectively manages diversity within the workplace
· Strong focus on service delivery and the needs and satisfaction of internal clients
· Able to see issues from a global, regional and corporate perspective
· Able to effectively plan and manage large projects
· Excellent communication and interpersonal skills at all levels
· Strong analytical, presentation and training skills
· Innovative and creative
· Visionary and strategic view of technology enablers (creative and innovative)
· High verbal and written communication ability; able to influence effectively at all levels
· Possesses the technical expertise and knowledge to lead by example and input into technical debates
· Depth and breadth of experience in infrastructure technologies
· Enterprise mentality and a global mindset
· Sense of humour
Role Key Performance Indicators:
· Design and deliver repeatable, best-in-class cloud database solutions
· Proactively monitor service quality and take action to scale operational services in line with business growth
· Generate operating efficiencies, to be agreed with the Infrastructure Services Manager
· Establish a "best in sector" level of operational service delivery and insight
· Help create an effective team
· Continuous improvement
The candidate must have expertise in ADF (Azure Data Factory) and be well versed in Python.
Performance optimization of scripts (code) and Productionizing of code (SQL, Pandas, Python or PySpark, etc.)
Required skills:
Bachelor's in Computer Science, Data Science, Computer Engineering, IT or equivalent
Fluency in Python (Pandas), PySpark, SQL, or similar
Azure data factory experience (min 12 months)
Able to write efficient code using traditional and OO concepts and modular programming, following the SDLC process.
Experience in production optimization and end-to-end performance tracing (technical root cause analysis)
Ability to work independently with demonstrated experience in project or program management
Azure experience; ability to translate data scientists' Python code and make it efficient for production cloud deployment
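Translating notebook-style data scientist code into efficient production code, as described above, often comes down to replacing quadratic scans with indexed lookups. The sketch below is illustrative; the data and field names are hypothetical:

```python
# Illustrative productionising step: the same enrichment written two ways.
def enrich_slow(orders, customers):
    """O(n*m): rescans the customer list for every order (typical notebook code)."""
    out = []
    for o in orders:
        name = next((c["name"] for c in customers if c["id"] == o["cust_id"]), None)
        out.append({**o, "customer": name})
    return out

def enrich_fast(orders, customers):
    """O(n+m): build an index once, then look up each order in constant time."""
    by_id = {c["id"]: c["name"] for c in customers}
    return [{**o, "customer": by_id.get(o["cust_id"])} for o in orders]

customers = [{"id": 1, "name": "Asha"}, {"id": 2, "name": "Ravi"}]
orders = [{"order": "A-7", "cust_id": 2}]
assert enrich_slow(orders, customers) == enrich_fast(orders, customers)
print(enrich_fast(orders, customers))
# → [{'order': 'A-7', 'cust_id': 2, 'customer': 'Ravi'}]
```

The same idea carries over to pandas (a merge instead of a row-wise apply) and to PySpark (a broadcast join instead of a driver-side loop), and end-to-end performance tracing is how such hotspots are found in the first place.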