Location: Andheri East, Mumbai
Notice Period: Immediate to 15 days
Responsibilities:
- Create algorithms from scratch.
- Create products and backend APIs as specified by the business team.
- Back-test and create hypotheses as required by the research team.
- Code backend logic for consumption by the UI team.
- Deploy WebSockets, Django, REST APIs, and dynamic TCP/UDP-based data flows (see the sketch after this list).
- Deploy and maintain code with version control.
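As a purely illustrative aside (not part of the posting), the backend API work listed above could look like the minimal Django REST Framework sketch below; the HealthCheckView name, route, and payload are hypothetical assumptions chosen only to keep the example self-contained.

# Minimal sketch, assuming Django and djangorestframework are installed and
# this module lives inside an existing Django project.
from django.urls import path
from rest_framework.response import Response
from rest_framework.views import APIView


class HealthCheckView(APIView):
    # Hypothetical endpoint: returns a small JSON payload the UI team could poll.
    def get(self, request):
        return Response({"status": "ok", "service": "backend-api"})


# Wired into the project's urls.py.
urlpatterns = [
    path("api/health/", HealthCheckView.as_view()),
]

WebSocket endpoints would typically be layered on top of this with Django Channels rather than plain Django views; that choice is an assumption here, not something the posting specifies.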
At Egnyte we develop content governance and collaboration products that are deployed across several large companies such as Yamaha and Red Bull. The Egnyte platform supports daily, business-critical operations for a million-plus user base interacting with a multi-petabyte content set.
We store, analyze, organize, and secure billions of files and petabytes of data with millions of users. We observe more than 1M API requests per minute on average. To make that possible and to provide the best possible experience, we rely on great engineers. For us, people who own their work from start to finish are integral. Our Engineers are part of the process from design to code, to test, to deployment, and back again for further iterations.
We’re looking for Senior Software Engineers who can take a complex problem and work with product managers, DevOps, and other team members to execute it end to end.
- Design and develop scalable cloud components that seamlessly integrate with on-premises systems.
- Challenge and redefine existing architecture or make 10x improvements in performance and scalability.
- Foresee post-deployment design challenges and performance and scale bottlenecks.
- Hire and mentor junior engineers
- Do code reviews and unit and performance testing of the code.
- Monitor and manage 3000+ nodes using modern DevOps tools and APM solutions.
- Demonstrated success designing and developing complex cloud based solutions
- Solid CS fundamentals with one or more areas of deep knowledge
- Experience with the following technologies: Java, SQL, Linux, Python, Nginx, HAProxy, BigQuery, HBase, New Relic, Memcached, Elasticsearch, Docker.
- Data-driven decision process
- Reliance on automated testing instead of manual QA
- Experience working with Google Cloud, AWS, or Azure is preferred
We would prefer the candidate to work from our Mumbai office for at least the first 6 months.
- Implement security and data protection solutions
- Bachelor’s or master’s degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics, or a related technical field; at least 3 years of relevant experience in lieu of the above if from a different stream of education.
- Well-versed in, and 3+ years of hands-on, demonstrable experience with:
  ▪ Stream and batch big data pipeline processing using Apache Spark and/or Apache Flink
  ▪ Distributed cloud-native computing, including serverless functions
  ▪ Relational, object store, document, graph, etc. database design and implementation
  ▪ Microservices architecture, API modeling, design, and programming
- 3+ years of hands-on development experience in Apache Spark using Scala and/or Java.
- Ability to write executable code for services using Spark RDD, Spark SQL, Structured Streaming, Spark MLlib, etc., with a deep technical understanding of the Spark processing framework (a minimal illustrative sketch follows this list).
- In-depth knowledge of standard programming languages such as Scala and/or Java.
- 3+ years of hands-on development experience in one or more libraries and frameworks such as Apache Kafka, Akka, Apache Storm, Apache NiFi, ZooKeeper, the Hadoop ecosystem (i.e., HDFS, YARN, MapReduce, Oozie, and Hive), etc.; extra points if you can demonstrate your knowledge with working examples.
- 3+ years of hands-on development experience in one or more relational and NoSQL datastores such as PostgreSQL, Cassandra, HBase, MongoDB, DynamoDB, Elasticsearch, Neo4j, etc.
- Practical knowledge of distributed systems, including partitioning, bucketing, the CAP theorem, replication, horizontal scaling, etc.
- Passion for distilling large volumes of data and analyzing performance, scalability, and capacity issues in big data platforms.
- Ability to clearly distinguish system performance from Spark job performance, and to perform Spark performance tuning and resource optimization.
- Perform benchmarking/stress tests and document best practices for different applications.
- Proactively work with tenants on improving overall performance and ensure the system is resilient and scalable.
- Good understanding of virtualization and containerization; must demonstrate experience with technologies such as Kubernetes, Istio, Docker, OpenShift, Anthos, Oracle VirtualBox, Vagrant, etc.
- Well-versed in and demonstrable working experience with API management, API gateways, service mesh, identity and access management, and data protection and encryption.
- Hands-on, demonstrable working experience with DevOps tools and platforms such as Jira, Git, Jenkins, code quality and security plugins, Maven, Artifactory, Terraform, Ansible/Chef/Puppet, Spinnaker, etc.
- Well-versed in AWS and/or Azure and/or Google Cloud; must demonstrate experience with at least five (5) services offered under AWS, Azure, or Google Cloud in any of these categories: Compute, Storage, Database, Networking & Content Delivery, Management & Governance, Analytics, Security, Identity, & Compliance (or equivalent demonstrable cloud platform experience).
- Good understanding of storage, networking, and storage networking basics, which will enable you to work in a cloud environment.
- Good understanding of network, data, and application security basics, which will enable you to work in a cloud as well as a business applications / API services environment.
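As a purely illustrative aside (not part of the listing), the Structured Streaming item above refers to code along the lines of the following minimal PySpark sketch; the posting asks for Scala and/or Java, and the built-in rate source and console sink are assumptions chosen only so the example runs standalone.

# Minimal sketch, assuming a local Spark 3.x installation with pyspark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("structured-streaming-sketch")
         .getOrCreate())

# The built-in "rate" source emits (timestamp, value) rows; windowing and
# counting them mirrors the pattern used for Kafka-backed pipelines.
events = spark.readStream.format("rate").option("rowsPerSecond", 10).load()

counts = (events
          .withWatermark("timestamp", "30 seconds")
          .groupBy(F.window("timestamp", "10 seconds"))
          .count())

query = (counts.writeStream
         .outputMode("update")
         .format("console")
         .option("truncate", "false")
         .start())

query.awaitTermination()

In a production pipeline the rate source would typically be swapped for a Kafka source and the console sink for a durable sink with checkpointing enabled.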
- Working with Databases and Linux platform
- Understanding algorithms, databases and their space and time complexities
- Writing unit and integration tests with reasonable coverage of code and interfaces
- Solving complex and interesting problems
- Taking up a high level of ownership and commitment towards the business and product vision
What you need to have:
- Minimum 1 year of experience
- Strong problem-solving skills
- Good understanding of data structures & algorithms and their space & time complexities
- Strong hands-on and practical working experience with at least one programming language: C/Java/C++/C#
- Excellent coding skills – should be able to convert the design into code fluently
- Strong technical aptitude and a good knowledge of CS fundamentals
- Hands-on experience working with Databases and Linux platform is a plus
- B-Tech in Computer Science or equivalent from a reputed college
- Good experience in at least one general programming language (Java, Ruby, Clojure, Scala, C/C++, Python and SQL)
- A solid foundation in computer science, with strong competencies in data structures, algorithms, and software design.
- A penchant for solving complex and interesting problems; experience working in a startup-like environment with high levels of ownership and commitment
- Excellent coding skills – should be able to convert design into code fluently
- Able to write unit and integration tests with reasonable coverage of code and interfaces
- TDD is a plus
inai is building the future of payments
inai is Segment for payments. We make the lives of digital/online merchants easier by enabling them to manage their payments stack in a low-touch / no-code fashion. Merchants can now future-proof their payment stack and break out of the persistent trade-off between a great checkout experience and the flexibility required to maintain it.
inai was founded by serial entrepreneurs who have decades of experience in
finance and tech. inai has been backed by marquee investors including Kunal
Shah, Razorpay, the first investors in Square/Twitter, and other stellar investors.
Background
Back-end engineers will build the core of inai’s platform, which comprises 3 broad teams:
• Front-end: handles the checkout experience for our merchants and the dashboard their product teams use daily
• Integrations: handles our integrations with various payment processors, wallets, BNPL providers, analytics providers, fraud and risk providers, and accounting software.
• Platform: The glue that holds it all together. APIs will be the norm, databases
your core, scalability, reliability, and system design your everyday concern.
You will
• Develop APIs to integrate with 3rd party systems — primarily in the payments
domain
• Work closely with our front end and integrations engineering teams and also
with our colleagues across the globe.
• Take ownership of the modules you develop, key technology decisions, and customer issues.
• Contribute towards documentation (internal and customer-facing), code
reviews, tooling, and processes.
You will have
• 2 to 8 years of experience as a full-stack engineer, including at least 1 year of recent experience handling payments.
• Experience working with 3rd party APIs. You should be able to peruse 3rd party API documentation and retrofit the APIs with the platform team.
• Experience writing APIs in Python and working with relational databases
• Can communicate and interact with a larger team
• Experience in working with multiple payment gateways or integrating them
• Experience with handling payment tokens.
Good-to-have
• Experience with OpenAPI, JSON API specs (see the sketch below)
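As a purely illustrative aside (not part of the posting), here is a minimal, hypothetical FastAPI sketch of the sort of Python API work described above; FastAPI serves a generated OpenAPI document for it at /openapi.json, and the /v1/charges route and its fields are invented for the example.

# Minimal sketch, assuming fastapi and uvicorn are installed.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="payments-sketch")


class ChargeRequest(BaseModel):
    amount: int            # amount in minor units, e.g. paise
    currency: str = "INR"  # hypothetical default


@app.post("/v1/charges")
def create_charge(req: ChargeRequest):
    # A real implementation would call a payment processor integration here.
    return {"status": "created", "amount": req.amount, "currency": req.currency}

# Run locally with: uvicorn sketch:app --reload  (assuming this file is sketch.py)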
Benefits
• Health insurance for you and your loved ones - on us.
• Work-life balance: Our success will not come at the expense of your work-life
balance
• Grow with us: Working in the intersection of fintech and SaaS, you will have
all the opportunities to grow in this thriving sector.
• Work from Home: We will help you set up a home office at your residence in India. inai will continue to operate remotely till the situation returns to normal.
It is India’s only B2B construction materials supply chain company. Construction & Infrastructure is one of the industries with the most complex supply chain problems, and we are fundamentally reengineering the way construction materials get procured. Our clients include the biggest names in India and the world: the LafargeHolcim Group, the JSW Group, the Tatas, and many others. We are backed by some of the world’s top marquee funds in our journey to bring transparency and standardization to an otherwise opaque industry.
Your responsibilities:
- Understand the business context and build high-quality code using proven design patterns
- Develop, test, and deploy integrations required to meet business requirements
- Carry out unit tests and other quality control mechanisms to inform and validate the code and design
- Utilize and monitor cloud infrastructure resources (such as AWS, Azure) efficiently
- Participate in a highly fluid environment applying agile software development principles
- Ensure the coding standards are on par with the best in the industry
Educational Qualifications:
- Bachelor's or Master’s degree in a quantitative field (e.g. Mathematics, Engineering, Computer Science).
Must have skills:
- 3+ years of work experience with mobile/web development
- Experience working with at least one of the following languages: Go, Java, Scala
- Strong understanding of relational and non-relational databases (MySQL, PostgreSQL, MongoDB, Cassandra)
- Strong understanding of message brokers
- Must have hands-on experience with RESTful APIs
- Must have a strong foundation in data structures and algorithms
- Using logic and reasoning to identify the strengths and weaknesses of alternative solutions, conclusions or approaches to problems
- Ability to deploy features on a daily basis: should be good at time management and prioritization
- Hands-on experience in CI/CD principles and TDD
- Strong written and verbal English communication skills
Good to have skills:
- Experience in event-driven & asynchronous I/O frameworks
- Exposure to business process and workflow automation
- Working experience in process driven and data intensive business applications
- Experience in working on web infrastructure with React.js and React Native
Professional traits:
- Self-motivated and persistent, with a “never give up” attitude
- Passion for innovation and adaptability to a lean startup culture
- Ability to work with minimal supervision, independently and / or as a member of a team
- Giving full attention to what other people are saying, taking time to understand the points being made, asking questions as appropriate
Job Title: Python Developer
Job Location: Pune ( Baner)
Experience: 4 to 8 years
Notice Period: 1 Month or Less
Skills: Python (Django, Flask), MySQL, strong focus on OOP and architecture, Bitbucket/GitHub, NoSQL
Technical Requirements:
- Experience in developing web applications and APIs (REST, XML, and others).
- Strong programming foundation in Python, MySQL, and OOP; experience in Django/Flask.
- Experience with and a good understanding of HTML5, CSS3, Bootstrap, Ajax, JS, etc.; experience with Angular or Node.js will be an added advantage.
- Solid exposure to API integrations and familiarity with various design and architectural patterns.
- In-depth knowledge of source code repositories and experience working with Bitbucket.
- Experience working on Apache HTTP Server or any other web/app server.
- Hands-on experience in DB design, architecture, coding, unit testing, and debugging.
- Experience working in an Agile development environment.
- Sound in data structure analysis and algorithm design.
- Ensure cross-platform compatibility of information retrieved from web services on Android and iOS platforms, in terms of Push Notifications, platform-specific issues, etc
- Good knowledge of relational databases, version control tools and of developing web services.
- Strong understanding of the software development life cycle and best practices
Roles and Responsibilities:
- Should be a problem solver with an attitude to contribute towards the success of the team/project as well as the organization.
- Should be able to guide other members of the team
- Should take initiative to improve code quality standards and team efficiency.
- Should be able to participate in requirements gathering and come up with efficient solutions
- Should be able to estimate efficiently at both high and low levels, along with assessing risk items
About Dunzo
Not that long ago, we were on WhatsApp with a handful of customers, many of whom were friends and family, getting you anything you needed - groceries, food, even sending packages anywhere in Bangalore. Today, we’re an app covering Bangalore, Pune, Gurgaon, Hyderabad, and Delhi. Dunzo is a technology company that makes local deliveries fast and easy - whether it’s that quick run from your local store, or getting you that book that is available only in one store far away from you, or even organizing your entire party!
We were recently voted #3 on LinkedIn’s Top Startups in India and here’s why. The Dunzo team gets an on ground opportunity to shape a product for users across cities. You are shaping cities you’ve grown up in - by making them more accessible than before through the use of technology. We’re solving an extremely tough problem, and we seek the best of minds who are great at problem-solving, passionate and willing to go the extra mile!
About The Team
As a team, we believe that the best idea wins - no matter where the idea comes from. We tackle problems that have existed for years - through technology and data. You'll be joining a vibrant, young team that is passionate about giving our users time back, providing flexible earning opportunities for our Partners, and enhancing local businesses.
Job Description
Must Haves :
- Proficient in algorithms and data structures.
- Strong experience in designing data-intensive and scalable systems.
- Comfortable with microservices-based architecture.
- Knowledge of software engineering processes and unit testing.
- Past experience of mentoring a team of at least two people.
- Prior experience with both relational and non-relational databases.
- Prior experience with caching.
- Ability to debug and hotfix production problems swiftly.
- Good communication skills.
Please let me know if you are interested.