

About InfoVision Labs India Pvt. Ltd. Pune


Level of skills and experience:
5 years of hands-on experience with Python, Spark, and SQL.
Experienced in AWS Cloud usage and management.
Experience with Databricks (Lakehouse, ML, Unity Catalog, MLflow).
Experience using various ML models and frameworks such as XGBoost, LightGBM, and Torch.
Experience with orchestrators such as Airflow and Kubeflow.
Familiarity with containerization and orchestration technologies (e.g., Docker, Kubernetes).
Fundamental understanding of Parquet, Delta Lake and other data file formats.
Proficiency with an IaC tool such as Terraform, CDK, or CloudFormation.
Strong written and verbal English communication skills; proficient in communicating with non-technical stakeholders.
2. Design software and make technology choices across the stack (from data storage to application to front-end)
3. Understand a range of tier-1 systems/services that power our product to make scalable changes to critical path code
4. Own the design and delivery of an integral piece of a tier-1 system or application
5. Work closely with product managers, UX designers, and end users and integrate software components into a fully functional system
6. Work on the management and execution of project plans and delivery commitments
7. Take ownership of the product/feature end-to-end, through all phases from development to production
8. Ensure the developed features are scalable and highly available with no quality concerns
9. Work closely with senior engineers for refining and implementation
10. Manage and execute project plans and delivery commitments
11. Create and execute appropriate quality plans, project plans, test strategies, and processes for development activities in concert with business and project management efforts
· Core responsibilities include analyzing business requirements and designs for accuracy and completeness, and developing and maintaining the relevant product.
· BlueYonder is seeking a Senior/Principal Architect in the Data Services department (under the Luminate Platform) to act as one of the key technology leaders to build and manage BlueYonder's technology assets in the Data Platform and Services.
· This individual will act as a trusted technical advisor and strategic thought leader to the Data Services department. The successful candidate will have the opportunity to lead, participate, guide, and mentor other people on the team on architecture and design in a hands-on manner. You will be responsible for the technical direction of the Data Platform. This position reports to the Global Head, Data Services, and will be based in Bangalore, India.
· Core responsibilities include architecting and designing (along with counterparts and distinguished architects) a ground-up, cloud-native (we use Azure) SaaS product in order management and micro-fulfillment.
· The team currently comprises 60+ global associates across the US, India (COE) and the UK, and is expected to grow rapidly. The incumbent will need leadership qualities to mentor junior and mid-level software associates on our team. This person will lead the Data Platform architecture (streaming and bulk) with Snowflake, Elasticsearch, and other tools.
Our current technical environment:
· Software: Java, Spring Boot, Gradle, Git, Hibernate, REST API, OAuth, Snowflake
· Application Architecture: scalable, resilient, event-driven, secure multi-tenant microservices architecture
· Cloud Architecture: MS Azure (ARM templates, AKS, HDInsight, Application Gateway, Virtual Networks, Event Hub, Azure AD)
· Frameworks/Others: Kubernetes, Kafka, Elasticsearch, Spark, NoSQL, RDBMS, Spring Boot, Gradle, Git, Ignite


non-metro and rural markets. DealShare has raised Series C funding of USD 21 million with key investors like WestBridge Capital, Falcon Edge Capital, Matrix Partners India, Omidyar Network, Z3 Partners and Partners of DST Global, and has total funding of USD 34 million. They have 2 million customers across Rajasthan, Gujarat, Maharashtra, Karnataka and Delhi NCR, with monthly transactions of 1.2 million and an annual GMV of USD 100 million. Our aim is to expand operations to 100 cities across India and reach an annual GMV of USD 500 million by the end of 2021.
DealShare started in September 2018 and had 5,000 active customers in the first three months. Today we have 25K transactions per day, 1 lakh DAU and 10 lakh MAU, with a monthly GMV of INR 100 crores and 50% growth MoM. We aim to hit 2 lakh transactions per day with an annual GMV of USD 500 million by 2021.
We are hiring for various teams in discovery (search, recommendation, merchandising, intelligent notifications), pricing (automated pricing, competition price awareness, balancing revenue with profits, etc.), user growth and retention (bargains, gamification), monetisation (ads), order fulfillment (cart/checkout, warehousing, last mile, delivery promise, demand forecasting), customer support, data infrastructure (warehousing, analytics), and ML infrastructure (data versioning, model repository, model training, model hosting, feature store, etc.). We are looking for passionate problem solvers to join us, solve really challenging problems, and scale DealShare systems.
You will:
● Implement the solution with minimal guidance, after the approach has been finalized with senior engineers.
● Write code that has good low level design and is easy to understand, maintain, extend
and test.
● Take end-to-end ownership of a product/feature from development to production, including fixing issues.
● Ensure high unit, functional and integration automated test coverage. Ensure releases
are stable.
● Communicate with various stakeholders (product, QA, senior engineers) as necessary to
ensure quality deliverables, smooth execution and launch.
● Participate in code reviews, improve development and testing processes.
● Participate in hiring great engineers
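The expectation of high automated test coverage above can be sketched with Python's stdlib `unittest`; the `cart_total` function and its tests are hypothetical illustrations, not part of DealShare's codebase:

```python
import unittest

def cart_total(prices, discount=0.0):
    """Sum item prices and apply a fractional discount (hypothetical example)."""
    if not 0.0 <= discount <= 1.0:
        raise ValueError("discount must be between 0 and 1")
    return round(sum(prices) * (1.0 - discount), 2)

class CartTotalTest(unittest.TestCase):
    def test_no_discount(self):
        self.assertEqual(cart_total([100.0, 50.0]), 150.0)

    def test_with_discount(self):
        self.assertEqual(cart_total([100.0], discount=0.1), 90.0)

    def test_invalid_discount(self):
        # Edge cases belong in the suite too, not just the happy path.
        with self.assertRaises(ValueError):
            cart_total([100.0], discount=2.0)

if __name__ == "__main__":
    unittest.main(exit=False)
```

The point is the habit, not the function: every unit of logic ships with tests for normal, boundary, and failure cases.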
Required:
● Bachelor’s degree (4 years) or higher in Computer Science or equivalent and 1-3 years
of experience in software development
● Excellent at problem solving; an independent thinker.
● Good understanding of computer science fundamentals, data structures and algorithms
and object oriented design.
● Good coding skills in any object oriented language (C++, Java, Scala, etc), preferably in
Java.
● Prior experience building one or more modules of a large-scale, highly available, low-latency, high-quality distributed system is preferred.
● Ability to multitask and thrive in a fast paced timeline-driven environment.
● Good team player and ability to collaborate with others
● Self driven and motivated, very high on ownership
Is a plus:
● Prior experience working in Java
● Prior experience with AWS offerings: EC2, S3, DynamoDB, Lambda, API Gateway, CloudFront, etc.
● Prior experience with big data technologies: Spark, Hadoop, etc.
● Prior experience with asynchronous processing (queuing systems) and workflow systems.
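The asynchronous-processing item above can be illustrated with a minimal stdlib queue-and-worker sketch; the doubling "job" is a stand-in for real work, not tied to any particular queuing product:

```python
import queue
import threading

def process_jobs(jobs):
    """Push jobs through a background worker via a queue (illustrative sketch)."""
    q = queue.Queue()
    results = []

    def worker():
        while True:
            job = q.get()
            if job is None:          # sentinel value: shut the worker down
                q.task_done()
                break
            results.append(job * 2)  # stand-in for real processing
            q.task_done()

    t = threading.Thread(target=worker)
    t.start()
    for job in jobs:
        q.put(job)                   # producer side: enqueue work
    q.put(None)                      # signal shutdown after the last job
    q.join()                         # block until every task_done is called
    t.join()
    return results

print(process_jobs([1, 2, 3]))  # [2, 4, 6]
```

Real systems swap `queue.Queue` for a durable broker (SQS, Kafka, etc.), but the decoupling of producer and consumer is the same idea.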
The Cloudera Data Warehouse Hive team is looking for a passionate senior developer to join our growing engineering team. This group targets the biggest enterprises wanting to use Cloudera's services in private and public cloud environments. Our product is built on open source technologies like Hive, Impala, Hadoop, Kudu, Spark and many more, providing unlimited learning opportunities.
A Day in the Life
Over the past 10+ years, Cloudera has experienced tremendous growth making us the leading contributor to Big Data platforms and ecosystems and a leading provider for enterprise solutions based on Apache Hadoop. You will work with some of the best engineers in the industry who are tackling challenges that will continue to shape the Big Data revolution. We foster an engaging, supportive, and productive work environment where you can do your best work. The team culture values engineering excellence, technical depth, grassroots innovation, teamwork, and collaboration.
You will manage product development for our CDP components, develop engineering tools and scalable services to enable efficient development, testing, and release operations. You will be immersed in many exciting, cutting-edge technologies and projects, including collaboration with developers, testers, product, field engineers, and our external partners, both software and hardware vendors.
Opportunity:
Cloudera is a leader in the fast-growing big data platforms market. This is a rare chance to make a name for yourself in the industry and in the open source world. The candidate will be responsible for Apache Hive and CDW projects. We are looking for a candidate who would like to work on these projects both upstream and downstream. If you are curious about the project and its code quality, you can check both at the following link. You can even start development before you join; this is one of the beauties of the OSS world.
Apache Hive: https://hive.apache.org/
Responsibilities:
- Build robust and scalable data infrastructure software.
- Design and create services and system architecture for your projects.
- Improve code quality through writing unit tests, automation, and code reviews.
- Write Java code and build services in the Cloudera Data Warehouse.
- Work with a team of engineers who review each other's code and designs and hold each other to an extremely high bar for quality.
- Understand the basics of Kubernetes.
- Build out the production and test infrastructure.
- Develop automation frameworks to reproduce issues and prevent regressions.
- Work closely with other developers providing services to our system.
- Help analyze and understand how customers use the product, and improve it where necessary.
Qualifications:
- Deep familiarity with the Java programming language.
- Hands-on experience with distributed systems.
- Knowledge of database concepts and RDBMS internals.
- Knowledge of the Hadoop stack, containers, or Kubernetes is a strong plus.
- Experience working in a distributed team.
- 3+ years of experience in software development.
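As background on the Hive stack mentioned above: Hive lays out partitioned tables as `key=value` directories on storage, which is what makes partition pruning possible. A minimal sketch of that naming convention (the helper function is an illustration, not a Hive API):

```python
def hive_partition_path(table_root, partition_spec):
    """Build a Hive-style partition directory path from (key, value) pairs.

    Hive stores each partition of a table in a nested directory named
    key=value, e.g. warehouse/sales/dt=2021-01-01/country=US.
    """
    parts = [f"{key}={value}" for key, value in partition_spec]
    return "/".join([table_root] + parts)

path = hive_partition_path("warehouse/sales", [("dt", "2021-01-01"), ("country", "US")])
print(path)  # warehouse/sales/dt=2021-01-01/country=US
```

Queries that filter on `dt` or `country` can then skip entire directories instead of scanning the whole table.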


• Proficient in software development from inception to production releases using modern programming languages (preferably Java, Node.js, and Scala)
• Hands-on experience with cloud infrastructure, solution architecture on AWS or Azure
• Prior experience working as a Full-stack engineer building cloud-native, SaaS products.
• Expertise in programming and designing for circuit breakers, the localized impact of failures, service meshes, event sourcing, distributed data transactions, and eventual consistency.
• Proficient in designing and developing SaaS on a microservices architecture
• Proficient in building fault tolerance, high availability, and autoscaling for microservices
• Proficient in Data Modelling for distributed computing
• Deep hands-on experience with microservices in Spring Boot and with large-scale projects in the Spring Framework
• Fluency in cloud-native solution architecture; designing HA and fault-tolerant deployment topologies for API Gateway, Kafka, and Spark clusters in the cloud.
• Fluency in AWS, Azure, Serverless Functions in AWS or Azure and in Docker and Kubernetes
• Avid practitioner and coach of Test-Driven Development
• Deep understanding of modeling real-world scheduling and process problems into algorithms running on memory- and compute-efficient data structures.
• We highly value polyglot engineers, so experience programming in more than one language is a must, preferably one of Groovy, Scala, Python, or Kotlin
• Excellent communication skills and collaboration temperament
• Ability to articulate technical matters to business stakeholders, and to translate business concerns into technical specifications.
• Proficiency in working with cross-functional teams to refine initiatives into objective features.
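The circuit-breaker item above refers to a well-known resiliency pattern: after repeated downstream failures, stop calling the failing dependency and fail fast instead. A minimal, framework-free sketch (the class, threshold, and `flaky` function are illustrative assumptions, not this company's implementation):

```python
class CircuitBreaker:
    """Trip open after `max_failures` consecutive failures; reject calls while open."""

    def __init__(self, max_failures=3):
        self.max_failures = max_failures
        self.failures = 0
        self.open = False

    def call(self, fn, *args):
        if self.open:
            # Fail fast: don't hammer a dependency that is already down.
            raise RuntimeError("circuit open: failing fast")
        try:
            result = fn(*args)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.open = True   # trip the breaker
            raise
        self.failures = 0          # any success resets the failure count
        return result

def flaky():
    raise TimeoutError("downstream timeout")

breaker = CircuitBreaker(max_failures=2)
for _ in range(2):
    try:
        breaker.call(flaky)
    except TimeoutError:
        pass
print(breaker.open)  # True
```

Production breakers (e.g., as found in resilience libraries) add a half-open state and a timeout-based reset; this sketch shows only the fail-fast core.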
Good To Have:
• Hands-on experience with Continuous Delivery and DevOps automation
• SRE and Observability implementation experience
• Refactoring Legacy products to microservices


Company Introduction
AutoScheduler is looking for a remote senior software developer to join our talented team. The ideal candidate is a self-starter who is interested in constant learning. We want this person to join our dynamic team as we take established software and redevelop it from scratch as part of a new startup.
Job Description
We are looking for a Backend C++ developer responsible for maintaining and developing algorithmic C++ for new and existing customers. Your primary responsibilities will be to design and develop applications and services, and to coordinate with the rest of the team working on different layers of the infrastructure. Therefore, a commitment to collaborative problem solving, robust design, and quality product is essential.
Responsibilities
- Work with non-technical personnel to translate business requirements into stories and epics
- Translate application storyboards and use cases into functional applications / features
- Design, build, and maintain efficient, reusable, and reliable code
- Ensure the best possible performance, quality, and responsiveness of applications
- Identify bottlenecks and bugs, and devise solutions to these problems
- Help maintain code quality, organization, and automation
- Write and maintain unit/functional/integration tests
Requirements
- Bachelor’s degree with STEM concentration + 2 years’ work experience, or equivalent work experience
- Fluent in English
- Demonstrable proficiency in C/C++ and the ability to read and parse C/C++ code
- Proficiency in Node.js and JavaScript
- Basic understanding of Common Language Runtime (CLR), its limitations, weaknesses, and workarounds
- Working knowledge of at least two other programming languages (e.g. Go and Python)
- Proficient understanding of modern distributed version control tools (e.g., Git)
- Thorough understanding of SQL, and in-depth experience with at least one RDBMS (e.g. PostgreSQL or Microsoft SQL Server)
- Experience with automated testing frameworks and unit tests
- Solid understanding of object-oriented programming and principles
- Knowledge of functional programming principles/concepts
Desired Skills
- Experience building applications with C++
- Experience building cross-platform applications with Scala
- Experience with functional programming
- Familiarity with concurrency patterns in Scala
- Experience building distributed systems and/or decoupled microservices
- Experience building software using cloud-based services (in any cloud platform)
- Experience using “gRPC” and Protocol Buffers (“protobuf”) and/or experience creating services that exchange non-JSON data over non-HTTP protocols
- Familiarity with continuous integration
- Familiarity with Docker / containerization
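The functional-programming items in the requirements above boil down to composing pure transformations (map, filter, fold) instead of mutating state in place. A tiny sketch (Python stdlib here for brevity; the posting's own stack is C++/Scala, and the `orders` data is made up):

```python
from functools import reduce

orders = [120, 45, 300, 80]

# Pure, composable transformations instead of in-place mutation:
large = list(filter(lambda x: x >= 100, orders))    # select big orders
doubled = list(map(lambda x: x * 2, large))         # transform each one
total = reduce(lambda acc, x: acc + x, doubled, 0)  # fold into a sum

print(large, doubled, total)  # [120, 300] [240, 600] 840
```

The same pipeline reads almost identically in Scala (`orders.filter(_ >= 100).map(_ * 2).sum`), which is why these concepts transfer across languages.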


Experience: 3-5 years
Domain: SQL Server / SSIS / cloud technology
Good knowledge of creating new tables in a database.
Using triggers, truncate, delete, views, etc.
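The items above (tables, triggers, delete, views) can be sketched with Python's stdlib SQLite driver; the syntax differs slightly from SQL Server (SQLite has no TRUNCATE, for example), and the table and trigger names are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Create a base table and an audit table.
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
cur.execute("CREATE TABLE audit (order_id INTEGER, action TEXT)")

# Trigger: log every delete from orders into audit.
cur.execute("""
    CREATE TRIGGER log_delete AFTER DELETE ON orders
    BEGIN
        INSERT INTO audit (order_id, action) VALUES (OLD.id, 'deleted');
    END
""")

# A view over the base table.
cur.execute("CREATE VIEW big_orders AS SELECT * FROM orders WHERE amount > 100")

cur.executemany("INSERT INTO orders (amount) VALUES (?)", [(50,), (150,), (250,)])
cur.execute("DELETE FROM orders WHERE amount = 50")  # fires the trigger

big_count = cur.execute("SELECT COUNT(*) FROM big_orders").fetchone()[0]
audit_action = cur.execute("SELECT action FROM audit").fetchone()[0]
print(big_count, audit_action)  # 2 deleted
conn.close()
```

In SQL Server the trigger would use `AFTER DELETE` with the `deleted` pseudo-table instead of `OLD`, but the concept (side-effect logic attached to DML) is the same.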

Spark / Scala experience should be more than 2 years.
A combination of Java and Scala is fine, and we are also open to a Big Data developer with strong core Java concepts (Scala / Spark Developer role).
Strong proficiency in Scala on Spark (Hadoop); Scala + Java is also preferred
Complete SDLC process and Agile methodology (Scrum)
Version control / Git



