· Core responsibilities include analyzing business requirements and designs for accuracy and completeness, and developing and maintaining the relevant product.
· BlueYonder is seeking a Senior/Principal Architect in the Data Services department (under the Luminate Platform) to act as one of the key technology leaders who build and manage BlueYonder's technology assets in the Data Platform and Services.
· This individual will act as a trusted technical advisor and strategic thought leader to the Data Services department. The successful candidate will have the opportunity to lead, participate, guide, and mentor other people on the team on architecture and design in a hands-on manner. You will be responsible for the technical direction of the Data Platform. This position reports to the Global Head, Data Services and will be based in Bangalore, India.
· Core responsibilities include architecting and designing (along with counterparts and distinguished architects) a ground-up, cloud-native (we use Azure) SaaS product for order management and micro-fulfillment.
· The team currently comprises 60+ global associates across the US, India (COE), and the UK and is expected to grow rapidly. The incumbent will need leadership qualities to mentor junior and mid-level software associates on the team. This person will lead the Data Platform architecture (streaming and bulk) with Snowflake, Elasticsearch, and other tools.
Our current technical environment:
· Software: Java, Spring Boot, Gradle, Git, Hibernate, REST API, OAuth, Snowflake
· Application Architecture: scalable, resilient, event-driven, secure multi-tenant microservices architecture (a minimal sketch follows this list)
· Cloud Architecture: MS Azure (ARM templates, AKS, HDInsight, Application Gateway, Virtual Networks, Event Hub, Azure AD)
· Frameworks/Others: Kubernetes, Kafka, Elasticsearch, Spark, NoSQL, RDBMS, Spring Boot, Gradle, Git, Ignite
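To make the event-driven microservices stack above concrete, here is a minimal, hedged sketch of a Spring Boot Kafka consumer (the topic, group ID, and payload handling are hypothetical placeholders and assume spring-kafka is configured; this is not actual BlueYonder code):

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class OrderEventConsumer {

    // Consumes order events published by upstream services in the
    // event-driven, multi-tenant microservices architecture described above.
    // Topic and group names are illustrative placeholders.
    @KafkaListener(topics = "order-events", groupId = "order-management")
    public void onOrderEvent(String payload) {
        // A real service would deserialize the payload (e.g. JSON to a DTO)
        // and hand it to tenant-aware domain logic.
        System.out.println("Received order event: " + payload);
    }
}
```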
Responsibilities:
- Programming
- Full stack system development
- Product Design
- Technical risk assessment and estimation
- Technical investigation/research, assessment, and recommendation
- Providing solutions and preparing proof-of-concept for technical proposals and demonstrations
- Product maintenance and support
Qualifications:
- Bachelor's degree in Computer Science, Computer Engineering, or a related course in Information Technology, Engineering, or Mathematics
- Ability to execute full software development life cycle (SDLC)
- 1+ years of experience in product development
- 1+ years of Java development experience
- 1+ years of experience with agile product development methodologies/processes
- Develop flowcharts, layouts, and documentation to identify requirements and solutions
- Experience with relational databases: SQL / MySQL / PL/SQL / Oracle
- Experience with Java frameworks (Spring / Spring MVC / Spring Boot / Hibernate)
- Integrate software components into a fully functional software system
- Develop software verification plans and quality assurance procedures
- Document and maintain software functionality
- Troubleshoot, debug and upgrade existing systems
- Experience in Unit testing is a plus
Knowledge in:
- JavaScript / ES6 / a reactive JavaScript framework (Vue.js); Java servlets experience is a huge plus
- REST Concepts
- VCS - Git
- AWS
- Excellent programming and problem-solving skills
- Experience with test-driven development
- Good communication skills
- Fast learner, detail-oriented
- Able to work under pressure
- Self-managing and able to collaborate with offsite team members
- Can render extra hours whenever necessary.
Training
After a rigorous training program of up to one month, you'll immediately get to work on one of our projects. We're working on mission-critical government systems or commercial products that are levelling up the way the world does business.
The work here is about making the lives of API developers and consumers easier. If you love thinking big and delving deep and enjoy envisioning truly elegant solutions, this role is definitely for you.
What you will be Doing
- You will abstract away complex data interactions with easy-to-use APIs that will power several mobile and web applications.
- You will also own, scale, and maintain the computational and storage infrastructure for the various micro-services and long-running jobs, designed and implemented by you and the team.
- We will look to you to make key decisions on the technology stack, architecture, networking, and security. We love working with bleeding-edge technology, especially if it improves the malleability and simplicity of our deliverables.
What you need
- The ideal Backend Engineers are polyglots who are fluent in HTTP and core CS concepts such as algorithms, data structures, and programming paradigms, and always pick the right tools for the right job.
- They have a keen eye for common security vulnerabilities and how to act on them (for example: DDoS attacks, SQL injection, etc.; see the sketch after this list).
- They understand what it takes to work in a startup environment and know when to trade performance for simplicity.
- They fail fast, learn faster, and execute in time.
- Strong communication skills, get-things-done attitude, and empathy
- Strong sense of ownership, drive and obsessive attention to detail.
- Comfortable with iterative development practices and code reviews
- Previous experience as part of a product-oriented team is a plus
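As a concrete illustration of the SQL injection point above, here is a hedged sketch (written in Java for consistency with the rest of this page, although this role's stack is Node.js; the table and column names are hypothetical): user input must be bound as a query parameter, never concatenated into the SQL text.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class UserLookup {

    // Vulnerable pattern (for contrast): "SELECT ... WHERE name = '" + name + "'"
    // lets input such as  alice' OR '1'='1  rewrite the query.

    // Safe pattern: the value is sent as a bound parameter, not as SQL text.
    public static void printMatchingUsers(Connection conn, String name) throws SQLException {
        String sql = "SELECT id, email FROM users WHERE name = ?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setString(1, name);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getLong("id") + " " + rs.getString("email"));
                }
            }
        }
    }
}
```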
Technical Skillsets:
Node.js + JavaScript, Go, TypeScript + Node.js, Clojure/Haskell/F#/Scala (languages/environments)
Koa, Express, Play (frameworks)
Asynchronous programming frameworks (Akka, Node.js, Tornado)
MongoDB, Postgres, Bigtable, Dynamo (databases)
Apache Kafka, NATS, RabbitMQ, ZeroMQ (queues)
Functional programming, FRP (functional reactive programming)
microservices, multi-tenant, distributed-systems, distributed-computing, event-sourcing
Good to have Skillsets:
Clojure/Haskell/F#/Scala
Apache Kafka, NATS
Functional programming, FRP (functional reactive programming)
event-sourcing
Koa
Why should you consider this role seriously?
- We have an audacious vision of helping companies fight counterfeiting and manage their supply chains more efficiently
- We have built a product and solved problems for some of the largest brands in the country, and have tested the platform at scale (with our tags present in over 50 million products already). We plan to grow 10x in the next year
- Ownership of key problems. Fast-paced environment
- We are a well-balanced team of experienced entrepreneurs and are backed by top investors across India and Silicon Valley (Venture Highway, Startup Buddy, etc.)
- Competitive market salary
- Opportunity to work directly with the CEO, COO, CTO of the company
- A chance to interact with top-notch executives from multiple industries
- Open vacation policy (and we really mean it!)
- Open Pantry
- As a team, we love to travel :). An off-site every quarter
- Augmenting, improving, redesigning, and/or re-implementing Dolat's low-latency/high-throughput production trading environment, which collects data from and disseminates orders to exchanges around the world
- Optimizing this platform by using network and systems programming, as well as other advanced techniques
- Developing systems that provide easy access to historical market data and trading simulations
- Building risk-management and performance-tracking tools
- Shaping the future of Dolat through regular interviewing and infrequent campus recruiting trips
- Implementing domain-optimized data structures (see the sketch after this list)
- Learning and internalizing the theories behind the current trading systems
- Participating in the design, architecture, and implementation of automated trading systems
- Taking ownership of systems from design through implementation
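To illustrate the "domain-optimized data structures" responsibility above, here is a hedged, hypothetical sketch (not Dolat's actual code) of a fixed-capacity, allocation-free ring buffer of the kind commonly used to hand orders between a producer thread and a consumer thread in low-latency systems:

```java
// Hypothetical sketch: a single-producer/single-consumer ring buffer that avoids
// per-message allocation and GC pressure on the hot path. Capacity must be a power
// of two so that index wrapping is a cheap bit mask.
public final class OrderRingBuffer<T> {

    private final Object[] slots;
    private final int mask;
    private volatile long head = 0;   // next slot to read  (advanced by the consumer)
    private volatile long tail = 0;   // next slot to write (advanced by the producer)

    public OrderRingBuffer(int capacityPowerOfTwo) {
        if (Integer.bitCount(capacityPowerOfTwo) != 1) {
            throw new IllegalArgumentException("capacity must be a power of two");
        }
        this.slots = new Object[capacityPowerOfTwo];
        this.mask = capacityPowerOfTwo - 1;
    }

    /** Producer thread only: returns false instead of blocking when the buffer is full. */
    public boolean offer(T order) {
        if (tail - head == slots.length) {
            return false;                          // full; caller decides how to shed load
        }
        slots[(int) (tail & mask)] = order;
        tail = tail + 1;                           // volatile write publishes the slot
        return true;
    }

    /** Consumer thread only: returns null when there is nothing to read. */
    @SuppressWarnings("unchecked")
    public T poll() {
        if (head == tail) {
            return null;                           // empty
        }
        int index = (int) (head & mask);
        T order = (T) slots[index];
        slots[index] = null;                       // release the consumed reference
        head = head + 1;                           // volatile write frees the slot for reuse
        return order;
    }
}
```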
- Bachelor's or master's degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics, or a related technical field; at least 3 years of relevant experience in lieu of the above if from a different stream of education.
- Well-versed in, with 3+ years of hands-on, demonstrable experience in:
▪ Stream & batch big data pipeline processing using Apache Spark and/or Apache Flink
▪ Distributed cloud-native computing, including serverless functions
▪ Relational, object store, document, graph, etc. database design & implementation
▪ Microservices architecture, API modeling, design, & programming
- 3+ years of hands-on development experience in Apache Spark using Scala and/or Java.
- Ability to write executable code for services using Spark RDD, Spark SQL, Structured Streaming, Spark MLlib, etc., with a deep technical understanding of the Spark processing framework (see the sketch at the end of this list).
- In-depth knowledge of standard programming languages such as Scala and/or Java.
- 3+ years of hands-on development experience with one or more libraries & frameworks such as Apache Kafka, Akka, Apache Storm, Apache NiFi, ZooKeeper, and the Hadoop ecosystem (i.e., HDFS, YARN, MapReduce, Oozie & Hive); extra points if you can demonstrate your knowledge with working examples.
- 3+ years of hands-on development experience with one or more relational and NoSQL datastores such as PostgreSQL, Cassandra, HBase, MongoDB, DynamoDB, Elasticsearch, Neo4j, etc.
- Practical knowledge of distributed systems involving partitioning, bucketing, the CAP theorem, replication, horizontal scaling, etc.
- Passion for distilling large volumes of data and analyzing performance, scalability, and capacity issues in big data platforms.
- Ability to clearly distinguish between system and Spark job performance, and to perform Spark performance tuning and resource optimization.
- Perform benchmarking/stress tests and document best practices for different applications.
- Proactively work with tenants on improving overall performance and ensuring the system is resilient and scalable.
- Good understanding of virtualization & containerization; must demonstrate experience with technologies such as Kubernetes, Istio, Docker, OpenShift, Anthos, Oracle VirtualBox, Vagrant, etc.
- Well-versed, with demonstrable working experience, in API Management, API Gateway, Service Mesh, Identity & Access Management, and Data Protection & Encryption.
- Hands-on, demonstrable working experience with DevOps tools and platforms, viz. Jira, Git, Jenkins, code quality & security plugins, Maven, Artifactory, Terraform, Ansible/Chef/Puppet, Spinnaker, etc.
- Well-versed in AWS and/or Azure and/or Google Cloud; must demonstrate experience with at least FIVE (5) services offered under AWS and/or Azure and/or Google Cloud in any of these categories: Compute or Storage, Database, Networking & Content Delivery, Management & Governance, Analytics, Security, Identity, & Compliance (or equivalent demonstrable cloud platform experience).
- Good understanding of storage, networks, and storage networking basics, which will enable you to work in a cloud environment.
- Good understanding of network, data, and application security basics, which will enable you to work in a cloud as well as a business applications / API services environment.
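As a minimal, hedged illustration of the Spark Structured Streaming experience asked for above (the brokers, topic, and aggregation are hypothetical placeholders, and it assumes the spark-sql-kafka connector is on the classpath):

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;

public class ClickstreamCounts {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("clickstream-counts")
                .getOrCreate();

        // Read a Kafka topic as an unbounded streaming Dataset (placeholder topic/brokers).
        Dataset<Row> events = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "localhost:9092")
                .option("subscribe", "clickstream")
                .load();

        // Interpret the Kafka value as a page name and maintain a running count per page.
        Dataset<Row> counts = events
                .selectExpr("CAST(value AS STRING) AS page")
                .groupBy("page")
                .count();

        // Emit the running counts to the console; "complete" mode re-emits the full table.
        StreamingQuery query = counts.writeStream()
                .outputMode("complete")
                .format("console")
                .start();

        query.awaitTermination();
    }
}
```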
● We believe that the role of an engineer at a typical product company in India has to evolve from just working in a request-response mode to something more involved.
● Typically an engineer has little to no connection with the product, its users, its overall success criteria, or the long-term vision of the product that he/she is working on.
● The system is not set up to encourage it. Engineers are evaluated on their tech prowess, and very little attention is given to other aspects of being a successful engineer.
● We don't hold appraisals because we believe that evaluation of work and feedback should be a constant affair rather than something that happens every 6 or 12 months. Besides, there is no better testament to your abilities than the growth of the product.
● We don't have a concept of hierarchy and hence we don't have promotions. All we have at Udaan are Software Engineers.
Skills & Knowledge:
○ 4-15 years of experience
○ Sound knowledge of programming
○ High ownership & impact-oriented
○ Creative thinking & implementation
○ Highly customer-obsessed & always insisting on the highest standards
Work Location: Indira Nagar, Bangalore
Work Days: Sunday to Thursday OR Monday to Friday
Shift: Day Time
Week Off: Friday & Saturday OR Saturday & Sunday
JD:
Development of applications in Java, including:
Building data processing platforms.
Developing microservice-oriented applications (mandatory; see the sketch after this list).
Interacting with stakeholders of the applications being developed.
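A minimal, hedged sketch of the kind of Spring Boot microservice endpoint implied above (the class, path, and payload are hypothetical illustrations and assume spring-boot-starter-web is on the classpath):

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
public class OrderServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(OrderServiceApplication.class, args);
    }

    // One self-contained endpoint standing in for a microservice API surface.
    @GetMapping("/orders/{id}")
    public String getOrder(@PathVariable("id") long id) {
        // A real service would delegate to a repository (JDBC/Hibernate) here.
        return "{\"id\": " + id + ", \"status\": \"CREATED\"}";
    }
}
```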
Desired Candidate Profile:
Must have experience in Java/JEE, the Spring Framework, and Microservices (mandatory)
Experience in SQL and JDBC
Experience with build tools: Maven, Git
Experience with cloud platforms (AWS, Azure) is a plus.
Experience: 3-5 years
Domain: SQL Server / SSIS / cloud technology
Good knowledge of creating new tables in a database.
Using triggers, truncate, delete, views, etc.