Rapido is looking for a team member who has a background in data modeling, distributed system design, and communication protocols, and is passionate about writing code and the art of management. We are a close-knit group spanning the full gamut of hardware and software design, and are looking for candidates with similarly broad engineering interests and technical management experience to become an integral part of our team.

The work:
- Oversee a group of top-tier developers creating the backbone infrastructure of Rapido's platform, and make self-guided code contributions
- Work hands-on with NoSQL databases and distributed, fault-tolerant systems, backed by in-depth technical understanding of both
- Design efficient, flexible data models and protocols for defining home automation devices on many different platforms
- Get involved in a diverse, cross-functional team spanning hardware, mobile apps, and cloud services
- Collaborate with other engineering and product teams in the organization to pull together diverse system requirements
- Help define and facilitate the architectural vision throughout the organization

Minimum qualifications:
- Bachelor's degree in Computer Science, Mathematics, Statistics, or a related field, or equivalent experience
- 3+ years of backend platform/cloud software development experience
- 2+ years of engineering management experience

Preferred qualifications:
- Understanding of multiple programming languages
- Experience with system-wide formal data modeling and protocol definition in distributed systems
- Knowledge of good API design and abstraction concepts
- Demonstrated achievement in delivering enterprise-grade software platforms
- Proficiency in designing, developing, and debugging distributed systems
- Experience working with cross-functional teams including backend, apps, and test engineering
- Excellent problem-solving, organizational, and critical-thinking skills
- Exceptional verbal and written communication skills, with the ability to effectively influence and communicate cross-functionally with all levels of the organization
- Experience working with external development teams/contractors to facilitate implementation
Byte Prophecy is looking for Data Engineers to build a critical piece of the enterprise data pipeline in our platform, MonitorFirst.

Candidates should:
- Have at least 1-2 years of relevant experience in any of the following technologies in our data pipeline: #ETL Tools, #Kafka, #Spark, #Cassandra, #Scala, or #Python
- Be hands-on and proficient in #Java, #Scala, and #SQL
- Have strong fundamentals in data structures, algorithms, and distributed systems
- Ideally have experience in product engineering and production-ready data pipelines (preferred)

Candidates will:
- Work in an agile environment in small, focused teams
- Need to be proactive and goal-oriented
- Enjoy working as a team player

About Byte Prophecy
We are an enterprise analytics platform company that helps some of the largest companies in India make key business decisions every day. As a unique single platform encompassing collection, transformation, processing, augmented analytics, and automated alerts, we've been getting great traction from key stakeholders in the enterprise ecosystem. For our next round of growth, we are looking to hire Data Engineers and Product Analysts for our office in Ahmedabad. Please send your CVs to email@example.com. Thanks!
Engineering Lead
- Fluent in either Python or NodeJS
- Has worked with MySQL, PostgreSQL, or MongoDB
- Experience working with payment systems, workflow management systems, or chat systems is a plus
- Has working knowledge of the complete development stack
- Fluent with AWS and Git
- Knowledge of Continuous Integration (CI) and automated deployment tools is a plus
- Previous experience leading a team or architecting systems is a plus
- Open-source contributions are a plus
We are looking for an experienced Java developer who will help us build a scalable, REST-API-based backend using microservices.

Key skills:
- Own the product functionality and work with the technical and product leadership to convert ideas into a great product
- Stay abreast of the latest back-end technologies and patterns, and proactively find ways to apply them to the business problem
- Thorough understanding of core Java and the Spring framework
- Experience with Spring Boot to bootstrap applications
- Good understanding of, and working experience with, RESTful web services
- Knowledge of distributed systems and how they differ from traditional monolithic applications

You get additional brownie points if you have:
- Knowledge of modern authorization mechanisms, such as JSON Web Tokens and OAuth2
- Familiarity with code versioning tools such as Git
- A self-starter attitude: you can think outside the box and come up with solutions to resolve and mitigate complex problems
- Experience working in an Agile development environment using methodologies like Scrum and tools like JIRA and Confluence

Experience:
- 4-7 years of work experience developing Java-based backend applications
- Around 1 year of work experience using Spring Boot, Spring Cloud, and microservices
- BE/B.Tech or higher, preferably in Computer Science

About Us
QUp is a leading healthcare product that is excited to offer a “Painless OPD” experience to patients and to health care providers like doctors and hospitals. We are a fast-growing startup that is using innovation and cutting-edge technologies to solve the OPD management problem. We offer competitive salary, freedom to explore cutting-edge tools & technologies, a flat hierarchy, and open communication channels to our people so that they continue to be growth drivers for the company.
We are looking for engineers who believe in challenging the status quo and are ready to be part of this change. If you are looking to take a leap of faith and work on technologies of the future, and you don't want to be yet another programmer writing an API for an already "billion-dollar startup", then we are looking for you.

What do we expect from you?
- You have the strong theoretical fundamentals required to build large-scale distributed systems, focusing especially on streaming and unstructured data
- You have a deep understanding of concepts like performance, availability, and fault tolerance
- You are passionate about solving real business problems using technology, and you can drive technology products end-to-end to deliver business impact
- You are familiar with engineering best practices, code reviews, and deployment strategies
- You understand the pressures and expectations that come with working at an early-stage startup
DevOps Architect, responsible for designing and implementing DevOps-related work and for clarifying system/deployment issues directly with the customer.
Work on different POCs. Experience in Java/J2EE programming and coding, and many more.
JOB REQUIREMENTS:
- Minimum 3-5 years of experience in software development
- Strong fundamentals in data structures and algorithms
- Experience in Python, Cassandra, Spark, MongoDB, Kafka, or ActiveMQ
- Understanding of microservice or distributed architecture
- Understanding of async programming
- Knowledge of messaging services, pub/sub or streaming (e.g., Kafka, ActiveMQ, RabbitMQ)
- Understanding of end-to-end development, including deployment and monitoring
- Has worked with SQL & NoSQL databases
- Good debugging skills
- Good analytical and problem-solving skills
Should have strong knowledge of Java, Node.js, Apache Spark, web services, REST APIs, R, and databases (SQL or NoSQL)
- Should be strong in multi-threading
- Good understanding of machine learning and big data
- Hands-on experience with cloud platforms like Azure, AWS, and Bluemix
- Good understanding of protocols like TCP, UDP, FTP, and MQTT, as well as Kafka
- Hands-on experience in the following areas:
  - Knowledge of IoT cloud and multiple verticals (Telecom, Solar, EMS, Smart City, Industrial)
  - Experience in developing and deploying web and worker roles and web sites, and in using queues
  - Experience in developing highly scalable, high-performing services
  - Experience using MySQL, MongoDB, and other NoSQL databases
  - Experience with caching techniques in Azure PaaS, preferably Redis Cache
  - Basic knowledge of analytics
  - Hands-on experience in API management is added value
- Self-motivated, independent, and a quick learner
ABOUT MOOSHAK
We're at a point where the urban English-speaking Indian population is almost all online. The next billion Indians online will all communicate via Indian languages. Mooshak was created with the singular aim of making the Internet fun and relevant for this large, untapped segment. At Mooshak, we want to connect and engage Indians in their own language. That presents problems in various domains: from creativity in content creation, to building a highly scalable platform, to applying AI and NLP techniques to Indian languages to understand what people are saying and react to what they want. Mooshak is architected to scale: irrespective of the number of followers, the read time for a feed remains constant. We achieve this by using distributed message queues, a distributed computing engine, and some nifty caching!

TECHNICAL RESPONSIBILITIES
Mooshak's tech stack:
- Java
- Node.js
- MongoDB
- Redis
- Apache Kafka & Apache Storm
- Nginx / Jenkins

Server developer's roles and responsibilities: You are expected to know at least 4 of these technologies, with the ability to quickly learn the others. You will play the leading role in all stages of server development: architecture, coding, final testing, and shipping. The APIs are written and the product works fine; you are expected to understand the architecture and enhance product functionality. Sometimes you may be required to double up as the DevOps person should the servers fail or the product not work as expected. The core APIs are written in Node.js; the distributed message queue (Kafka) and compute engine (Storm) are implemented in Java. Understanding of Angular 2 is a big plus, as our web app is built on it.

NON-TECHNICAL RESPONSIBILITIES
We are a startup. This means that:
- You will be expected to be someone who comes up with solutions instead of problems.
- You will be expected to work non-stop, including weekends, if the servers crash. But otherwise we are quite chill!
- You will be expected to talk to multiple stakeholders (customers, designers, client-side developers) to meet user and business needs.
- A high aptitude and a positive attitude are a must.
- You should be comfortable working independently as well as in a team. We are a lean team right now, with you as the only server developer (assisted by the folks who built the platform).

JOB LOCATION
You would be working out of our office in Pune. You may be required to travel occasionally to Bangalore, where our previous tech team sits.
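The Mooshak posting above describes keeping feed read time constant regardless of follower count by using message queues and caching. A minimal in-memory sketch of the fan-out-on-write pattern that typically achieves this (all names here are hypothetical illustrations; the real system would push through Kafka/Storm and cache in Redis rather than Python dicts):

```python
from collections import deque

FEED_LIMIT = 100  # cap each cached feed at the newest 100 posts

followers = {}  # author -> set of follower ids
feeds = {}      # user -> deque of post ids (stand-in for a Redis list)

def follow(follower, author):
    followers.setdefault(author, set()).add(follower)

def publish(author, post_id):
    # Fan the post out to every follower's cached feed at WRITE time.
    # In production this loop would be driven by a message queue so the
    # write-time cost is absorbed asynchronously.
    for user in followers.get(author, set()):
        feed = feeds.setdefault(user, deque(maxlen=FEED_LIMIT))
        feed.appendleft(post_id)

def read_feed(user, count=10):
    # Reading is just a slice of the precomputed list: O(count),
    # independent of how many people the user follows.
    return list(feeds.get(user, deque()))[:count]

follow("alice", "bob")
publish("bob", "post-1")
publish("bob", "post-2")
print(read_feed("alice"))  # -> ['post-2', 'post-1'] (newest first)
```

The trade-off of this design is extra work and storage at write time (one push per follower) in exchange for cheap, constant-time reads, which is why very-high-follower accounts are often special-cased in real systems.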
Position Description
- Assists in providing guidance to small groups of two to three engineers, including offshore associates, for assigned engineering projects
- Demonstrates up-to-date expertise in software engineering and applies it to the development, execution, and improvement of action plans
- Generates weekly, monthly, and yearly reports using JIRA and open-source tools, and provides updates to leadership teams
- Proactively identifies issues and identifies the root cause of critical issues
- Works with cross-functional teams, sets up KT sessions, and mentors team members
- Coordinates with the Sunnyvale and Bentonville teams
- Models compliance with company policies and procedures, and supports the company mission, values, and standards of ethics and integrity
- Provides and supports the implementation of business solutions
- Provides support to the business
- Troubleshoots business and production issues, and provides on-call support

Minimum Qualifications
- BS/MS in Computer Science or a related field
- 8+ years' experience building web applications
- Solid understanding of computer science principles
- Excellent soft skills
- Understanding of major algorithms, like searching and sorting
- Strong skills in writing clean code using languages like Java and J2EE technologies
- Understanding of how to engineer RESTful services and microservices, and knowledge of major software patterns like MVC, Singleton, Facade, and Business Delegate
- Deep knowledge of web technologies such as HTML5, CSS, and JSON
- Good understanding of continuous integration tools and frameworks like Jenkins
- Experience working in Agile environments, with Scrum and Kanban
- Experience with performance tuning for very large-scale apps
- Experience writing scripts in Perl, Python, and shell
- Experience writing jobs using open-source cluster computing frameworks like Spark
- Database design experience: relational (MySQL, Oracle), SOLR, and NoSQL stores such as Cassandra, MongoDB, and Hive
- Aptitude for writing clean, succinct, and efficient code
- Attitude to thrive in a fun, fast-paced, startup-like environment
Couture.ai is building a patent-pending AI platform targeted at vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined 200+ million end users. The founding team consists of BITS Pilani alumni with experience creating global startup success stories. The core team we are building consists of some of the best minds in India in artificial intelligence research and data engineering.

We are looking to fill multiple roles requiring 2-7 years of research or large-scale production implementation experience, with:
- Rock-solid algorithmic capabilities
- Production deployments for massively large-scale systems, real-time personalization, big data analytics, and semantic search
- Or credible research experience in innovating new ML algorithms and neural nets

A GitHub profile link is highly valued. For the right fit into the Couture.ai family, compensation is no bar.
Couture.ai is building a patent-pending AI platform targeted at vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined 200+ million end users. For this role, a credible display of innovation in past projects is a must.

We are looking for hands-on leaders in data engineering with 5-11 years of research or large-scale production implementation experience, with:
- Proven expertise in Spark, Kafka, and the Hadoop ecosystem
- Rock-solid algorithmic capabilities
- Production deployments for massively large-scale systems, real-time personalization, big data analytics, and semantic search
- Expertise in containerization (Docker, Kubernetes) and cloud infrastructure, preferably OpenStack
- Experience with Spark ML, TensorFlow (& TF Serving), MXNet, Scala, Python, NoSQL DBs, Kubernetes, and ElasticSearch/Solr in production

A tier-1 college degree (BE from IITs, BITS Pilani, IIITs, top NITs, DTU, or NSIT; or an MS from Stanford, UC, MIT, CMU, UW–Madison, ETH, or other top global schools) or an exceptionally strong work history is a must. Let us know if this interests you and you would like to explore the profile further.
We are looking for a full-time senior resource to lead our Python-driven dev team. The candidate will be responsible for designing, developing, and taking highly scalable applications to production, with the opportunity to work on leading ML frameworks as well as custom-built frameworks for enhanced financial analytics. Crediwatch is an automated, intelligent data curation platform that helps businesses make faster and smarter decisions. Crediwatch aids sophisticated credit and other risk assessment models by providing data intelligence, predictive analysis, and decision-enabling technologies that maximise customer profitability and performance. Crediwatch has received accolades in the Citibank Tech4Integrity challenge (worldwide), the Barclays Rise accelerator, and Tech30 by YourStory, to name a few. We are based in the heart of Bangalore and are growing fast.
RESPONSIBILITIES:
1. Full ownership of tech, from driving product decisions to architecture to deployment
2. Develop a cutting-edge user experience and build cutting-edge technology solutions like instant messaging on poor networks, live discussions, live videos, and optimal matching
3. Use billions of data points to build a user personalisation engine
4. Build a data network effects engine to increase engagement & virality
5. Scale the systems to billions of daily hits
6. Deep-dive into performance, power management, memory optimisation, and network connectivity optimisation for the next billion Indians
7. Orchestrate complicated workflows, asynchronous actions, and higher-order components
8. Work directly with the Product and Design teams

REQUIREMENTS:
1. Should have hacked some (computer or non-computer) system to your advantage
2. Built and managed systems with a scale of 10Mn+ daily hits
3. Strong architectural experience
4. Strong experience in memory management, performance tuning, and resource optimisation
5. PREFERENCE: if you are a woman, an ex-entrepreneur, or hold a CS bachelor's degree from IIT/BITS/NIT

P.S. If you don't fulfil one of the requirements, you need to be exceptional in the others to be considered.
Looking for a technically sound and excellent trainer on big data technologies. Get an opportunity to become well known in the industry and gain visibility. Host regular sessions on big data related technologies and get paid to learn.
Securonix is a Big Data security analytics product company, and ours is the only product that delivers real-time user and entity behavior analytics (UEBA) on Big Data.
Our company is working on some really interesting projects in the Big Data domain across various fields (utilities, retail, finance). We are working with some big corporates and MNCs around the world. As a Big Data Engineer here, you will deal with big data in structured and unstructured form, as well as streaming data from Industrial IoT infrastructure. You will work on cutting-edge technologies, explore many others, and contribute back to the open-source community. You will get to know and work on an end-to-end processing pipeline that handles all types of work: storage, processing, machine learning, visualization, etc.
www.aaknet.co.in/careers/careers-at-aaknet.html
Are you extraordinary, a rock star who has hardly found a place to leverage or challenge your potential, and has not yet spotted a skyrocketing opportunity? Come play with us and face the challenges we can throw at you; chances are you might be humiliated (positively), but do not take it that seriously! Please be informed that we rate character and attitude as highly as, if not higher than, your great skills, experience, and sharpness. :) Best wishes & regards, Team Aak!
Develop analytic tools, working on Big Data and distributed systems.
- Provide technical leadership in developing our core analytics platform
- Lead development efforts on product features using Scala/Java
- Demonstrable excellence in innovation, problem solving, analytical skills, data structures, and design patterns
- Expert in building applications using Spark and Spark Streaming
- Exposure to NoSQL (HBase/Cassandra), Hive, Pig Latin, and Mahout
- Extensive experience with Hadoop and machine learning algorithms