Mandatory skill set: Azure and Python with data frameworks
Experience: 5-8 yrs
Package: ₹1 lakh per month
Mode: C2H (6 months to 1 year)
About Upboot Consulting Services
Responsibilities:
- Review data science models; handle code refactoring and optimization, containerization, deployment, versioning, and quality monitoring.
- Design and implement cloud solutions, build MLOps on the cloud (preferably AWS)
- Work with workflow orchestration tools like Kubeflow, Airflow, Argo, or similar tools
- Test and validate data science models, and automate their testing.
- Communicate with a team of data scientists, data engineers, and architects, and document the processes.
Eligibility:
- Rich hands-on experience writing object-oriented code in Python
- Minimum 3 years of MLOps experience (including model versioning, model and data lineage, monitoring, model hosting and deployment, scalability, orchestration, continuous learning, and automated pipelines)
- Understanding of data structures, data systems, and software architecture
- Experience using MLOps frameworks like Kubeflow, MLflow, and Airflow pipelines for building, deploying, and managing multi-step ML workflows based on Docker containers and Kubernetes
- Exposure to deep learning approaches and modeling frameworks (PyTorch, TensorFlow, Keras, etc.)
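One of the practices listed above, model versioning, can be sketched in a few lines. In practice a registry such as MLflow would be used; the toy `ModelRegistry` class, its record format, and the sample model name below are assumptions made for illustration only.

```python
import hashlib

class ModelRegistry:
    """Tracks model artifacts by content hash and a monotonically increasing version."""

    def __init__(self):
        self.models = {}  # name -> list of version records

    def register(self, name, artifact_bytes, metadata):
        versions = self.models.setdefault(name, [])
        record = {
            "version": len(versions) + 1,
            "sha256": hashlib.sha256(artifact_bytes).hexdigest(),
            "metadata": metadata,
        }
        versions.append(record)
        return record

    def latest(self, name):
        return self.models[name][-1]

registry = ModelRegistry()
registry.register("churn-model", b"weights-v1", {"auc": 0.81})
rec = registry.register("churn-model", b"weights-v2", {"auc": 0.84})
print(rec["version"])  # 2
```

Hashing the artifact bytes makes each version traceable to exact model weights, which is the core of the lineage requirement above.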
Responsibilities
Data associates play a critical role, working on various data- or content-focused projects across the organisation. The role is work from home, with a monthly in-person meet-up with other team members in the local region.
- Analyze large sets of unstructured data, extract insights, and store them in data management systems.
- Research, gather, and write informational news articles and stories.
- Categorize entities based on defined rules and acquired knowledge and experience, maintain dashboards, and validate information with proprietary algorithms and data-driven heuristics.
- Analyze the market (including competitors) and ensure a rich level of data quality across all products and platforms.
- Complete ad hoc data retrieval and analysis using relational databases, Excel and other data management systems.
- Monitor existing metrics as well as develop and propose new metrics to make actionable intelligence available to business stakeholders.
- Support cross-functional teams on the day-to-day execution of projects and initiatives.
- Communicate insights to key stakeholders.
Requirements
- Preferred: Bachelor's in Engineering or Science.
- GPA of 8+ or an overall score of 80%+.
- Location: Coimbatore / Remote
- Strong analytical and problem-solving skills with a focus on quality and attention to detail.
- Strong proficiency in English reading, writing, and communication.
- Strong work ethic and personal initiative; a reliable self-starter capable of working with a high degree of autonomy.
- Ability to work across global cross-office teams and in a team environment.
- Excellent organizational and task management skills.
- Strong verbal and written communication skills with the ability to articulate results and issues to internal and client teams.
- Process management and improvement focus, and willingness to learn cutting-edge tools and technologies.
- Minimum of 8 years of experience, of which 4 years should be of applied data mining experience in disciplines such as call centre metrics.
- Strong experience in advanced statistics and analytics, including segmentation, modelling, regression, forecasting, etc.
- Experience with leading and managing large teams.
- Demonstrated pattern of success in using advanced quantitative analytic methods to solve business problems.
- Demonstrated experience with Business Intelligence/Data Mining tools to work with data, investigate anomalies, construct data sets, and build models.
- It is critical to share details of projects undertaken (preferably in the telecom industry), specifically analysis drawn from CRM data.
XpressBees – a logistics company started in 2015 – is amongst the fastest growing
companies of its sector. While we started off rather humbly in the space of
ecommerce B2C logistics, the last 5 years have seen us steadily progress towards
expanding our presence. Our vision to evolve into a strong full-service logistics
organization reflects itself in our new lines of business like 3PL, B2B Xpress and cross
border operations. Our strong domain expertise and constant focus on meaningful
innovation have helped us rapidly evolve as the most trusted logistics partner of
India. We have progressively carved our way towards best-in-class technology
platforms, an extensive network reach, and a seamless last mile management
system. While on this aggressive growth path, we seek to become the one-stop-shop
for end-to-end logistics solutions. Our big focus areas for the very near future
include strengthening our presence as service providers of choice and leveraging the
power of technology to improve efficiencies for our clients.
Job Profile
As a Lead Data Engineer in the Data Platform Team at XpressBees, you will build the data platform
and infrastructure to support high quality and agile decision-making in our supply chain and logistics
workflows.
You will define how we collect and operationalize data (structured and unstructured), build production pipelines for our machine learning models, and meet our real-time, near-real-time, and batch reporting and dashboarding requirements. As a Senior Data Engineer in the XB Data Platform Team, you will use your experience with modern cloud and data frameworks to build products (with storage and serving systems) that drive optimisation and resilience in the supply chain via data visibility, intelligent decision making, insights, anomaly detection, and prediction.
What You Will Do
• Design and develop data platform and data pipelines for reporting, dashboarding and
machine learning models. These pipelines would productionize machine learning models
and integrate with agent review tools.
• Meet data completeness, correctness, and freshness requirements.
• Evaluate and identify the data store and data streaming technology choices.
• Lead the design of the logical model and implement the physical model to support
business needs. Produce logical and physical database designs across platforms (MPP,
MR, Hive/Pig) that are optimal for different use cases (structured/semi-structured).
Envision and implement the data modelling, physical design, and performance
optimization techniques the problem requires.
• Support your colleagues by reviewing code and designs.
• Diagnose and solve issues in our existing data pipelines and envision and build their
successors.
Qualifications & Experience relevant for the role
• A bachelor's degree in Computer Science or related field with 6 to 9 years of technology
experience.
• Knowledge of Relational and NoSQL data stores, stream processing and micro-batching to
make technology & design choices.
• Strong experience in System Integration, Application Development, ETL, Data-Platform
projects. Talented across technologies used in the enterprise space.
• Software development experience, including:
• Expertise in relational and dimensional modelling
• Exposure across the full SDLC process
• Experience in cloud architecture (AWS)
• Proven track record of maintaining existing technical skills and developing new ones, so
that you can make strong contributions to deep architecture discussions around systems
and applications in the cloud (AWS).
• Characteristics of a forward thinker and self-starter who flourishes with new challenges
and adapts quickly to new knowledge
• Ability to work with cross-functional teams of consulting professionals across multiple
projects.
• Knack for helping an organization to understand application architectures and integration
approaches, to architect advanced cloud-based solutions, and to help launch the build-out
of those systems
• Passion for educating, training, designing, and building end-to-end systems.
Remote Work, US shift
General Scope and Summary
The Data and Analytics Team sits in the Digital and Enterprise Capabilities Group and is responsible for driving the strategy, implementation, and delivery of Data,
Analytics, and Automation capabilities across the enterprise.
This global team will deliver “Next-Gen Value” by establishing core Data and Analytics capabilities needed to effectively manage and exploit data as an enterprise asset. Data Platform Operations will be responsible for implementing and supporting Enterprise Data Operations tools and capabilities, enabling teams
to answer strategic and business questions through data.
Roles and Responsibilities
● Manage overall data operations ensuring adherence to data quality metrics by establishing standard operating procedures and best practices/playbooks.
● Champion the advocacy and adoption of enterprise data assets for analytics through optimal operating models.
● Provide day-to-day ownership and project management of data operations activities, including data quality/data management support cases and other ad hoc requests.
● Create standards, frameworks for CI/CD pipelines and DevOps.
● Collaborate cross-functionally to develop and implement data operations policies, balancing centralized control and standardization with decentralized speed and flexibility.
● Identify areas for improvement; create procedures, teams, and policies to support near-real-time clean data or batch-and-close processes, as applicable.
● Improve processes by tactically focusing on business outcomes. Drive prioritization based on business needs and strategy.
● Lead and control workflow operations by driving critical issues and discussions with partners to identify and implement improvements.
● Responsible for defining, measuring, monitoring, and reporting key SLA metrics to support the team's vision.
Experience, Education and Specialized Knowledge and Skills
Must thrive working in a fast-paced, innovative environment while remaining flexible, proactive, resourceful, and efficient. Strong interpersonal skills, ability to understand
stakeholder pain points, ability to analyze complex issues to develop relevant and realistic solutions and recommendations. Demonstrated ability to translate strategy into action; excellent technical skills and an ability to communicate complex issues in a simple way and to orchestrate solutions to resolve issues and mitigate risks.
About the Company:
It is a Data-as-a-Service company that helps businesses harness the power of data. Our technology fuels some of the most interesting big data projects of the world. We are a small bunch of people working towards shaping the imminent data-driven future by solving some of its fundamental and toughest challenges.
Role: We are looking for an experienced team lead to drive data acquisition projects end to end. In this role, you will be working in the web scraping team with data engineers, helping them solve complex web problems and mentoring them along the way. You’ll be adept at delivering large-scale web crawling projects, breaking down barriers for your team, planning at a higher level, and getting into the detail to make things happen when needed.
Responsibilities
- Interface with clients and sales team to translate functional requirements into technical requirements
- Plan and estimate tasks with your team, in collaboration with the delivery managers
- Engineer complex data acquisition projects
- Guide and mentor your team of engineers
- Anticipate issues that might arise and proactively factor them into the design
- Perform code reviews and suggest design changes
Prerequisites
- 5-8 years of relevant experience
- Fluent programming skills and well-versed with scripting languages like Python or Ruby
- Solid foundation in data structures and algorithms
- Excellent tech troubleshooting skills
- Good understanding of web data landscape
- Prior exposure to the DOM and XPath, and hands-on experience with Selenium/automated testing is a plus
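The DOM/XPath skills mentioned above can be illustrated with the standard library alone. Real crawling work would typically use lxml or Selenium; the sample HTML and field names here are invented for this sketch.

```python
import xml.etree.ElementTree as ET

# Toy markup standing in for a scraped page (well-formed, so ElementTree can parse it).
html = """
<html><body>
  <div class="product"><span class="name">Widget</span><span class="price">10</span></div>
  <div class="product"><span class="name">Gadget</span><span class="price">25</span></div>
</body></html>
"""

root = ET.fromstring(html)
# ElementTree supports a limited XPath subset; attribute predicates like this work.
products = root.findall(".//div[@class='product']")
rows = [
    {
        "name": p.find("span[@class='name']").text,
        "price": int(p.find("span[@class='price']").text),
    }
    for p in products
]
print(rows)  # [{'name': 'Widget', 'price': 10}, {'name': 'Gadget', 'price': 25}]
```

Real pages are rarely well-formed XML, which is why production crawlers reach for a forgiving parser; the extraction pattern (locate nodes by predicate, pull typed fields) stays the same.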
Skills and competencies
- Prior experience with team handling and people management is mandatory
- Work independently with little to no supervision
- Extremely high attention to detail
- Ability to juggle between multiple projects
PriceLabs (chicagobusiness.com/innovators/what-if-you-could-adjust-prices-meet-demand) is cloud-based software that helps vacation and short-term rentals dynamically manage prices just the way large hotels and airlines do! Our mission is to help small businesses in the travel and tourism industry by giving them access to advanced analytical systems that are often restricted to large companies.
We're looking for someone with strong analytical capabilities who wants to understand how our current architecture and algorithms work, and help us design and develop long lasting solutions to address those. Depending on the needs of the day, the role will come with a good mix of team-work, following our best practices, introducing us to industry best practices, independent thinking, and ownership of your work.
Responsibilities:
- Design, develop and enhance our pricing algorithms to enable new capabilities.
- Process, analyze, model, and visualize findings from our market level supply and demand data.
- Build and enhance internal and customer-facing dashboards to better track metrics and trends that help customers use PriceLabs more effectively.
- Take ownership of product ideas and design discussions.
- Occasional travel to conferences to interact with prospective users and partners, and learn where the industry is headed.
Requirements:
- Bachelor's, Master's, or Ph.D. in Operations Research, Industrial Engineering, Statistics, Computer Science, or other quantitative/engineering fields.
- Strong understanding of analysis of algorithms, data structures and statistics.
- Solid programming experience, including the ability to quickly prototype an idea and test it out.
- Strong communication skills, including the ability and willingness to explain complicated algorithms and concepts in simple terms.
- Experience with relational databases and strong knowledge of SQL.
- Experience building data heavy analytical models in the travel industry.
- Experience in the vacation rental industry.
- Experience developing dynamic pricing models.
- Prior experience working in a fast-paced environment.
- Willingness to wear many hats.
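The SQL requirement above can be sketched with the standard library's sqlite3 module. The bookings table and the occupancy query are invented for illustration, but occupancy-style aggregates are the kind of demand signal a dynamic pricing model consumes.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bookings (listing TEXT, night TEXT, booked INTEGER)")
conn.executemany(
    "INSERT INTO bookings VALUES (?, ?, ?)",
    [
        ("cabin", "2024-07-01", 1),
        ("cabin", "2024-07-02", 0),
        ("loft", "2024-07-01", 1),
        ("loft", "2024-07-02", 1),
    ],
)

# Occupancy rate per listing: AVG over a 0/1 flag gives the booked fraction.
rows = conn.execute(
    "SELECT listing, AVG(booked) FROM bookings GROUP BY listing ORDER BY listing"
).fetchall()
print(rows)  # [('cabin', 0.5), ('loft', 1.0)]
```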
- Owns the end-to-end implementation of the assigned data processing components/product features, i.e. design, development, deployment, and testing of the data processing components and associated flows, conforming to best coding practices
- Creation and optimization of data engineering pipelines for analytics projects
- Support data and cloud transformation initiatives
- Contribute to our cloud strategy based on prior experience
- Independently work with all stakeholders across the organization to deliver enhanced functionalities
- Create and maintain automated ETL processes with a special focus on data flow, error recovery, and exception handling and reporting
- Gather and understand data requirements; work with the team to achieve high-quality data ingestion and build systems that can process and transform the data
- Be able to comprehend the application of database indexes and transactions
- Be involved in the design and development of a Big Data predictive analytics SaaS-based customer data platform using object-oriented analysis, design and programming skills, and design patterns
- Implement ETL workflows for data matching, data cleansing, data integration, and management
- Maintain existing data pipelines, and develop new data pipelines using big data technologies
- Responsible for leading the effort of continuously improving the reliability, scalability, and stability of microservices and the platform
- Data pre-processing, data transformation, data analysis, and feature engineering
- Performance optimization of scripts (code) and productionizing of code (SQL, Pandas, Python or PySpark, etc.)
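The automated-ETL emphasis above on error recovery and exception reporting can be sketched in a few lines. The source, transform, and record shapes below are hypothetical; the point is the quarantine-and-report pattern.

```python
def extract():
    # Stand-in for reading from a queue, file, or API.
    return [{"amount": "10.5"}, {"amount": "oops"}, {"amount": "3"}]

def transform(record):
    return {"amount": float(record["amount"])}

def run_pipeline():
    loaded, errors = [], []
    for record in extract():
        try:
            loaded.append(transform(record))
        except (ValueError, KeyError) as exc:
            # Bad rows are quarantined and reported rather than failing the whole run.
            errors.append({"record": record, "error": str(exc)})
    return loaded, errors

loaded, errors = run_pipeline()
print(len(loaded), len(errors))  # 2 1
```

In a production pipeline the `errors` list would feed a dead-letter store or alerting channel, which is what "exception handling and reporting" usually means in practice.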
Required skills:
- Bachelor's in Computer Science, Data Science, Computer Engineering, IT, or equivalent
- Fluency in Python (Pandas), PySpark, SQL, or similar
- Azure Data Factory experience (minimum 12 months)
- Able to write efficient code using traditional and OO concepts and modular programming, following the SDLC process
- Experience in production optimization and end-to-end performance tracing (technical root cause analysis)
- Ability to work independently, with demonstrated experience in project or program management
- Azure experience; ability to translate data scientists' Python code and make it efficient (production-ready) for cloud deployment
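Taking a data scientist's working-but-slow Python and making it production-efficient, as the last requirement above asks, often comes down to replacing nested scans with hashed lookups or vectorized operations. The function names and sample data below are hypothetical.

```python
def enrich_slow(orders, customers):
    """O(n*m): scans the full customer list for every order."""
    out = []
    for o in orders:
        for c in customers:
            if c["id"] == o["customer_id"]:
                out.append({**o, "segment": c["segment"]})
    return out

def enrich_fast(orders, customers):
    """O(n+m): builds a dict index once, then does constant-time lookups."""
    index = {c["id"]: c["segment"] for c in customers}
    return [{**o, "segment": index[o["customer_id"]]} for o in orders]

customers = [{"id": 1, "segment": "retail"}, {"id": 2, "segment": "wholesale"}]
orders = [{"order": "A", "customer_id": 2}, {"order": "B", "customer_id": 1}]

# Both produce identical output; only the asymptotic cost differs.
assert enrich_slow(orders, customers) == enrich_fast(orders, customers)
print(enrich_fast(orders, customers)[0]["segment"])  # wholesale
```

The same idea scales up: in Pandas it becomes a `merge` instead of a row loop, and in PySpark a join instead of a driver-side iteration.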
Job role:
As a data analyst, you will be responsible for compiling actionable insights from data and helping program, sales, and marketing managers build data-driven processes. Your role will involve driving initiatives to optimize for operational excellence and revenue.
Job Location: Indore | Full-Time Internship | Stipend: Performance-Based
About the company:
Anaxee Digital Runners is building India's largest last-mile verification & data collection network of Digital Runners (shared feet-on-street, tech-enabled) to help Businesses & Consumers reach remotest parts of India, on-demand. KYC | Field Verification | Data Collection | eSign | Tier-2, 3 & 4
Sounds like a moonshot? It is. We want to make REACH across India (remotest places), as easy as ordering pizza, on-demand. Already serving 11000 pin codes (57% of India) | Website: www.anaxee.com
Important: Check out our company pitch (6 min video) to understand this goal - https://www.youtube.com/watch?v=7QnyJsKedz8
Responsibilities:
- Ensure that data flows smoothly from source to destination so that it can be processed
- Utilize strong database skills to work with large, complex data sets to extract insights
- Filter and cleanse unstructured (or ambiguous) data into usable data sets that can be analyzed to extract insights and improve business processes
- Identify new internal and external data sources to support analytics initiatives and work with appropriate partners to absorb the data into new or existing data infrastructure
- Build tools for automating repetitive tasks so that bandwidth can be freed for analytics
- Collaborate with program managers and business analysts to help them come up with actionable, high-impact insights across product lines and functions
- Work closely with top management to prioritize information and analytic needs
Requirements:
- Bachelor's or Master's (pursuing or graduated) in a quantitative field (such as Engineering, Statistics, Math, Economics, or Computer Science with Modeling/Data Science), preferably with work experience of over [X] years.
- Ability to program in any high-level language is required. Familiarity with R and statistical packages is preferred.
- Proven problem solving and debugging skills.
- Familiar with database technologies and tools (SQL/R/SAS/JMP, etc.), data warehousing, transformation, and processing. Work experience with real data for customer insights, business, and market analysis will be advantageous.
- Experience with text analytics, data mining and social media analytics.
- Statistical knowledge in standard techniques: Logistic Regression, Classification models, Cluster Analysis, Neural Networks, Random Forests, Ensembles, etc.
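Of the techniques listed above, logistic regression is the simplest to sketch from scratch. This stdlib-only gradient-descent fit on a toy one-dimensional dataset is an illustration of the idea, not a production implementation (in practice you would use statsmodels or scikit-learn); the toy data and hyperparameters are arbitrary choices.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit weight w and bias b by gradient descent on the log-loss."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradient of the mean log-loss w.r.t. w and b.
        grad_w = sum((sigmoid(w * x + b) - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum((sigmoid(w * x + b) - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy data: larger x tends to mean class 1, with a boundary near x = 2.5.
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(xs, ys)
print(sigmoid(w * 0.5 + b) < 0.5, sigmoid(w * 4.5 + b) > 0.5)  # True True
```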