This position is part of the Analytics product Engineering Productivity team. Engineering Productivity works closely with other engineers, data scientists, product teams, and many others not only to increase our systems' scalability and reliability, but also to enable the rapid development of new feature code for our customers. As a software engineer, you will play an integral role in building and maintaining anomaly detection integrations, automating engineering process workflows, assisting with monitoring and instrumentation governance, design, and implementation, and much more.
Roles and Responsibilities:
- Object-oriented programming experience in Python
- Experience with stream-based processing models, distributed streaming platforms such as Kafka, or control theory
- Experience with SQL (MySQL, Oracle, or PostgreSQL)
- Solid computer science fundamentals with regard to data structures, algorithms, time complexity, etc.
Key Competencies and Skills:
- Solid understanding of statistics and probability
- Experience optimizing and debugging highly performant Python applications (mandatory)
- Experience developing and scaling RESTful web services
- Prior work with column stores (Vertica) and NoSQL stores (Redis, Aerospike)
- Streaming technology: Kafka/Kubernetes experience (mandatory)
Education and Qualifications: BA/BS degree and 4+ years of experience, OR MS degree and 2+ years of experience in software engineering (degree in Computer Science or a related field preferred), OR equivalent experience in software development
We are looking for exceptional and ambitious backend developers for a product startup. As a backend developer, you will work on designing & developing the product. We are looking for candidates with an exceptional level of coding experience, ideally people who have worked at a product startup. You will need to be self-driven and eager to learn new technologies, programming languages & tools.
Skills & Requirements
- Ability to work independently, accountable for your own actions, and able to act with both urgency and integrity
- Experience in Python, NodeJS
- Experience with test-driven development and its frameworks
- Familiarity with distributed revision control systems such as Git, and with continuous integration tools like Jenkins, Travis, or CircleCI
- Experience in Swift, Objective-C, Kotlin, or Java is a plus
Looking to hire a Senior Backend Developer for the technical team: good experience in Node.js and GoLang, as well as experience leading a team.
At least 2 years of experience in Node.js and GoLang.
1. Lead & deliver products that meet business needs
2. Analyse the business and develop products that achieve market fit
3. Implement the best system architecture & technology in development
4. Create reports and present them to the business team & management
5. Identify & mitigate project risks
6. Be flexible and adaptable to changing and varied work settings
7. Be clear & transparent in setting achievable targets & expectations for the team
8. Become thoroughly familiar with all the relevant technologies, especially those associated with the software or application under construction
Should have experience in API development using Node.js (Express) and Flask. Sound knowledge of database design and of data warehouse design and development. Should have experience with version control systems like Git. Very good knowledge of design patterns, algorithms, application development, and system design. Should be able to create data pipelines using technologies like Airflow/Luigi. Should be flexible enough to work with any database management system, whether relational or NoSQL. Experience dealing with real-time streaming data and batch data using a modern tech stack such as Kafka, Spark, and Flink. Very good experience in designing microservices, and experience with orchestration tools like Kubernetes and Docker Swarm.
Essenvia is building an online platform to reduce the time and cost of bringing medical devices to market and to streamline the medical device regulatory pathway. We are looking for a savvy Data Engineer to join our team based out of Bangalore. The hire will be responsible for creating proprietary data sets for machine learning algorithms using various conventional and non-conventional data sources. The Data Engineer will support our data scientists on data initiatives and will ensure an optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of becoming a key member in designing the data architecture to support our next generation of products and data initiatives, and must be able to work on tight timelines in a start-up culture.
Responsibilities
---------------------
Create and maintain an optimal data pipeline architecture. Assemble large, complex data sets from various sources. Transform the raw data as per functional requirements. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and NoSQL technologies. Keep the data separated and secured across multiple data centres and AWS regions. Work with data scientists to strive for greater functionality in our data systems.
Mandatory Skills
---------------------
Must have advanced working SQL knowledge and experience working with relational and NoSQL databases, as well as working familiarity with a variety of databases.
Must have strong exposure to database architectures and design. Must have knowledge of algorithms and data structures. Must have programming knowledge in Python and Java. Must have knowledge of text mining / text extraction / regex matching. Must have knowledge of OCR.
Desirable Skills
-----------------
Knowledge of Elasticsearch. Knowledge of NLP. Knowledge of cloud-based architecture.
Role:
- Work closely with the Architect to design, implement, and deploy core cloud applications
- Manage and execute against project plans and delivery commitments
- Troubleshoot and solve complex cloud issues
- Work with other teams (internal or external) to integrate your component or service into another application or service
- Participate in design reviews and code reviews of your work and the work of your peer engineers
- Develop highly scalable cloud applications using object-oriented programming techniques
Role Requirements:
- Proven hands-on software development experience
- Hands-on experience in designing and developing applications using Java EE platforms
- Object-oriented analysis and design using common design patterns
- Experience with the Spring Framework
- Experience with MVC, JDBC, and REST/RESTful services
- Experience with build tools such as Ant, Maven, and Gradle
- Experience with continuous integration
- Understanding of Java and JEE internals (class loading, memory management, transaction management, etc.)
- Ability to write reusable Java libraries
- Excellent knowledge of relational databases, SQL, and ORM technologies (JPA2, Hibernate)
- Experience with test-driven development
- Bachelor's degree in Computer Science, Engineering, or a related subject
- Development experience in Scala would be useful
About Artivatic:
- Artivatic is a technology startup that uses AI/ML/deep learning to build intelligent products & solutions for finance, healthcare & insurance businesses. It is based out of Bangalore with a 20+ member team focused on technology. Artivatic is building cutting-edge solutions to enable 750+ million people to get insurance, financial access, and health benefits using alternative data sources, increasing their productivity, efficiency, automation power, and profitability, and hence helping them do business more intelligently & seamlessly.
- Artivatic offers lending underwriting, credit/insurance underwriting, fraud prediction, personalization, recommendation, risk profiling, consumer profiling intelligence, KYC automation & compliance, automated decisions, monitoring, claims processing, sentiment/psychology behaviour analysis, auto insurance claims, travel insurance, disease prediction for insurance, and more.
- We have raised US $300K to date, built products successfully, and completed a few PoCs with some top enterprises in the insurance, banking & health sectors. Currently, we are 4 months away from generating continuous revenue.
Skills:
- Building server-side logic that powers our APIs, in effect deploying machine learning models in a production system that can scale to billions of API calls
- Scaling and performance tuning of databases to handle billions of API calls and thousands of concurrent requests
- Collaborate with the data science team to build effective solutions for data collection, pre-processing, and integrating machine learning into the workflow
- Collaborate, provide technical guidance, and engage in design and code review for other team members
- Excellent programming and software design skills in Scala, Python, and Java (plus Cassandra, architecture, and API design), including debugging, performance analysis, and test design
- Proficiency with at least one Scala, GoLang, or Python micro-framework such as Flask, Tornado, Play, Spring, etc.,
with experience in building REST APIs
- Experience with or understanding of building web crawlers, data-fetching bots, etc.
- Experience with the design and optimisation of Neo4j, Cassandra, NoSQL databases, PostgreSQL, Redis, and Elasticsearch
- Familiarity with one of the cloud service providers, AWS or Google Compute Engine
- Computer Science degree with 4+ years of backend programming experience
Experience: 3+ years
Location: Sony World Signal, Koramangala 4th Block, Bangalore