Primary responsibility is to take up product development as a full-stack developer: work with the larger product team, arrive at the best architectural approach, choose the most appropriate frameworks, and code and deliver. Proven success as an individual contributor. Should have consistently demonstrated the ability and commitment to deliver major initiatives in a timely manner. Embrace good development practices including design specification, coding standards, unit testing, and code reviews. A self-starter who loves to take on hard problems, loves solving service scalability problems, enjoys breaking things, and is enthusiastic about learning new technologies and working in startup environments.
3. Key Result Areas
· Create and maintain optimal data pipelines.
· Assemble large, complex data sets that meet functional / non-functional business requirements.
· Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
· Keep our data separated and secure.
· Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
· Build analytics tools that utilize the data pipeline to provide actionable insights into key business performance metrics.
· Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
· Work with data and analytics experts to strive for greater functionality in our data systems.
4. Knowledge, Skills and Experience
Core Skills: We are looking for a candidate with 7+ years of experience in a Data Engineer role. They should also have experience using the following software/tools:
· Experience in developing Big Data applications using Spark, Hive, Sqoop, Kafka, and MapReduce
· Experience with stream-processing systems: Spark Streaming, Storm, etc.
· Experience with object-oriented/functional scripting languages: Python, Scala, etc.
· Experience in designing and building dimensional data models to improve accessibility, efficiency, and quality of data
· Should be proficient in writing advanced SQL, with expertise in SQL performance tuning. Experience with data science and machine learning tools and technologies is a plus
· Experience with relational SQL and NoSQL databases, including Postgres and Cassandra
· Experience with Azure cloud services is a plus
· Financial Services knowledge is a plus
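To make the dimensional-modeling and SQL skills above concrete, here is a minimal star-schema sketch using Python's built-in sqlite3; the table and column names are hypothetical, chosen purely for illustration:

```python
import sqlite3

# Minimal star schema: one fact table joined to two dimension tables
# (hypothetical names, for illustration only).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE fact_sales  (product_id INTEGER, date_id INTEGER, amount REAL);
""")
conn.executemany("INSERT INTO dim_product VALUES (?, ?)",
                 [(1, "books"), (2, "games")])
conn.executemany("INSERT INTO dim_date VALUES (?, ?)",
                 [(10, "2024-01"), (11, "2024-02")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(1, 10, 100.0), (1, 11, 50.0), (2, 10, 75.0)])

# Typical analytical query against the model: revenue by category and month.
rows = conn.execute("""
SELECT p.category, d.month, SUM(f.amount) AS revenue
FROM fact_sales f
JOIN dim_product p ON p.product_id = f.product_id
JOIN dim_date    d ON d.date_id    = f.date_id
GROUP BY p.category, d.month
ORDER BY p.category, d.month
""").fetchall()
print(rows)
```

The point of the star layout is that analytical queries stay simple joins from one fact table out to descriptive dimensions, which is what makes the model easy to tune.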
What is the Job like? You will be working closely with the Java backend team: breaking down complex requirements into simpler stories, working with various stakeholders and helping convert requirements to code, managing, mentoring and reviewing engineers for their technical contributions, and participating actively in hiring and nurturing talent. Your focus will be on delivering products in a timely manner with high quality. Familiarity with multiple software development practices and tools, and the proven ability to adapt, is expected as well. Required expectations for the Job Role: 5+ years of experience building scalable systems, including at least 2 years of direct people management experience. Worked on large-scale Java, Spring, Hibernate applications with a good understanding of the web stack. Good understanding of the nuances of microservices systems and REST APIs. Good understanding of relational databases, preferably MySQL. Worked with message brokers like RabbitMQ and Apache Kafka, and application containers like Docker. Analyze, design, architect, develop and maintain software solutions across multiple projects. You should be able to communicate effectively among and between stakeholder groups.
Job Description
We are looking for a Data Engineer who will be responsible for collecting, storing, processing, and analyzing huge sets of data coming from different sources.
Responsibilities
Working with Big Data tools and frameworks to provide requested capabilities
Identifying development needs in order to improve and streamline operations
Developing and managing BI solutions
Implementing ETL processes and data warehousing
Monitoring performance and managing infrastructure
Skills
Proficient understanding of distributed computing principles
Proficiency with Hadoop and Spark
Experience with building stream-processing systems, using solutions such as Kafka and Spark Streaming
Good knowledge of data querying tools: SQL and Hive
Knowledge of various ETL techniques and frameworks
Experience with Python/Java/Scala (at least one)
Experience with cloud services such as AWS or GCP
Experience with NoSQL databases, such as DynamoDB or MongoDB, will be an advantage
Excellent written and verbal communication skills
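The ETL work mentioned above follows a standard extract-transform-load shape. A minimal pure-Python sketch (record fields and function names are hypothetical; a real pipeline would read from source systems and write to a warehouse):

```python
# Minimal ETL sketch: extract raw rows, transform them (coerce types,
# drop malformed records), and load them into a target store.

def extract():
    # Stand-in for reading from a source system (file, API, database).
    return [
        {"user": "a", "amount": "10.5"},
        {"user": "b", "amount": "oops"},   # malformed record
        {"user": "c", "amount": "3"},
    ]

def transform(rows):
    cleaned = []
    for row in rows:
        try:
            cleaned.append({"user": row["user"], "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # in practice, route bad records to a dead-letter store
    return cleaned

def load(rows, target):
    # Stand-in for writing to a warehouse table.
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # 2 valid records loaded; the malformed one was dropped
```

The same three-stage structure scales up directly to Spark or NiFi jobs, where each stage becomes a distributed operator instead of a local function.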
Products@DataWeave: We, the Products team at DataWeave, build data products that provide timely insights that are readily consumable and actionable, at scale. Our underpinnings are: scale, impact, engagement, and visibility. We help businesses make data-driven decisions every day. We also give them insights for long-term strategy. We are focused on creating value for our customers and helping them succeed. How we work: It's hard to tell what we love more, problems or solutions! Every day, we choose to address some of the hardest data problems there are. We are in the business of making sense of messy public data on the web. At serious scale! Read more on Become a DataWeaver. What do we offer?
- Opportunity to work on some of the most compelling data products that we are building for online retailers and brands.
- Ability to see the impact of your work and the value you are adding to our customers almost immediately.
- Opportunity to work on a variety of challenging problems and technologies to figure out what really excites you.
- A culture of openness. Fun work environment. A flat hierarchy. Organization-wide visibility. Flexible working hours.
- Learning opportunities with courses, trainings, and tech conferences. Mentorship from seniors in the team.
- Last but not the least, competitive salary packages and fast-paced growth opportunities.
Roles and Responsibilities:
● Build a low-latency serving layer that powers DataWeave's Dashboards, Reports, and Analytics functionality
● Build robust RESTful APIs that serve data and insights to DataWeave and other products
● Design user interaction workflows on our products and integrate them with data APIs
● Help stabilize and scale our existing systems. Help design the next generation of systems.
● Scale our back-end data and analytics pipeline to handle increasingly large amounts of data.
● Work closely with the Head of Products and UX designers to understand the product vision and design philosophy
● Lead/be a part of all major tech decisions. Bring in best practices. Mentor younger team members and interns.
● Constantly think scale, think automation. Measure everything. Optimize proactively.
● Be a tech thought leader. Add passion and vibrancy to the team. Push the envelope.
Skills and Requirements:
● 5-7 years of experience building and scaling APIs and web applications.
● Experience building and managing large-scale data/analytics systems.
● Have a strong grasp of CS fundamentals and excellent problem-solving abilities. Have a good understanding of software design principles and architectural best practices.
● Be passionate about writing code and have experience coding in multiple languages, including at least one scripting language, preferably Python.
● Be able to argue convincingly why feature X of language Y rocks/sucks, or why a certain design decision is right/wrong, and so on.
● Be a self-starter, someone who thrives in fast-paced environments with minimal 'management'.
● Have experience working with multiple storage and indexing technologies such as MySQL, Redis, MongoDB, Cassandra, Elasticsearch.
● Good knowledge (including internals) of messaging systems such as Kafka and RabbitMQ.
● Use the command line like a pro. Be proficient in Git and other essential software development tools.
● Working knowledge of large-scale computational models such as MapReduce and Spark is a bonus.
● Exposure to one or more centralized logging, monitoring, and instrumentation tools, such as Kibana, Graylog, StatsD, Datadog, etc.
● Working knowledge of building websites and apps. Good understanding of integration complexities and dependencies.
● Working knowledge of Linux server administration as well as the AWS ecosystem is desirable.
● It's a huge bonus if you have some personal projects (including open source contributions) that you work on during your spare time.
Show off some of the projects you have hosted on GitHub.
What will you do? You will be responsible for the design and development of large-scale, multi-tenant, distributed systems using scalable, fault-tolerant architecture with distributed queues (Kafka), distributed caches (Redis), high-volume data stores (MongoDB, Cassandra, Elasticsearch), and container-centric deployments (Kubernetes). Write Java code using best practices and high quality standards. Participate in code reviews and deep-dive into design discussions.
Can you please directly apply here: locale.freshteam.com/jobs
What would you spend most of your time doing? As a software engineer at an early-stage startup, you will be responsible for laying the foundation of all engineering systems. Your day might begin with designing a new micro-service supposed to handle 500 million pings on its first day in production and end with fierce debates on coding guidelines or the best practices for handling data consistency across distributed systems. Being an enterprise-focused company, our systems don't scale linearly or even exponentially. Every new customer brings the scale of the millions of customers that they serve. We need to build robust, scale-ready and fault-tolerant services from day one. Our clients rely on it. Best for someone who is:
1. A polyglot, fluent in system design principles and not in a particular language or framework. It will be your responsibility to evaluate all available options and pick the best one for the job.
2. Passionate about moving fast without breaking things, and insistent on rigorous testing.
3. Excited to own the outcome of what (s)he builds while clearly communicating the steps to get there.
If you are looking to spend your 20s learning how to build a company from scratch, if building systems at scale excites you, if you are mesmerized by what the world of location can offer, or if you are passionate about zero-to-one, we will see you on the other side! :)
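One common tactic for the data-consistency problem mentioned above is idempotent event handling: every event carries a unique id, so replayed deliveries (retries, at-least-once brokers) are applied only once. A toy sketch, with illustrative names not drawn from any specific codebase:

```python
# Idempotent consumer sketch: side effects are applied at most once per
# event id, so retried or duplicated deliveries cannot corrupt state.
# In production the seen-set would live in a durable store (e.g. a DB
# unique constraint), not in process memory.

class PingStore:
    def __init__(self):
        self._seen = set()
        self.count = 0

    def handle(self, event_id, payload):
        if event_id in self._seen:
            return False          # duplicate delivery, safely ignored
        self._seen.add(event_id)
        self.count += 1           # apply the side effect exactly once
        return True

store = PingStore()
store.handle("evt-1", {"lat": 12.9, "lon": 77.6})
store.handle("evt-1", {"lat": 12.9, "lon": 77.6})  # retried delivery
store.handle("evt-2", {"lat": 13.0, "lon": 77.5})
print(store.count)  # 2: the retry of evt-1 was deduplicated
```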
As a Golang Developer, you will be part of our core team helping us build cutting-edge products for our clients.
OUTLINE OF THE TECHNICAL CAPABILITIES REQUIRED:
- Experience working with Golang.
- Experience building microservices using Golang is good to have.
- Prior experience in building web apps from the ground up is a huge plus.
- Experience working with Docker, Docker Compose & Kubernetes is a plus.
- We have a special love for developers with experience in building distributed systems.
- Prior experience in system architecture is a huge plus.
- If you have worked on scalable enterprise architectures previously, you are best suited for the team.
Contributions to FOSS, StackOverflow, and a GitHub profile with your side projects, if available, will definitely be an added advantage. Knowledge of Machine Learning/AI concepts is a big plus. Please mention your current and expected CTC and Notice Period/Date of Availability while applying.
About the role: As an Engineering Manager, your role would involve architecting systems capable of serving as the brains of complex distributed products. In addition, you'd also closely manage engineers on the team and contribute to team building. A strong technologist at Meesho cares about code modularity, scalability, and re-usability, and thrives in a complex and ambiguous environment. Required skills & experience: Bachelors / Masters in Computer Science or equivalent from a premier institute with at least 8+ years of overall professional experience. At least 2+ years of experience in managing/leading software development teams. Create clear career paths for team members and help them grow with regular & deep mentoring. Perform regular performance evaluations and share and seek feedback. Able to drive sprints and OKRs. Exceptional team-managing skills; experience in building large-scale distributed systems. Experience in scalable systems - transactional systems (B2C). Expertise in Java/J2EE and multithreading. Deep understanding of transactional and NoSQL DBs. Deep understanding of messaging systems - Kafka. Good experience with cloud infrastructure - AWS preferably. Good to have: data pipelines, ES. Ability to think and analyze both breadth-wise and depth-wise while designing and implementing services. Excellent teamwork skills, flexibility, and ability to handle multiple tasks.
Design and development of supply chain applications for retail customers, making use of open source technologies. This can involve taking our own product and customizing it per customer requirements, or developing applications from scratch.
- 3+ years of experience in building complex, highly scalable, high-volume, low-latency enterprise applications using languages such as Java, NodeJS, Go and/or Scala
- Strong experience in building microservices using technologies like Spring Boot, Spring Cloud, Netflix OSS, Zuul
- Deep understanding of microservices design patterns, service registry and discovery, and externalization of configuration
- Experience in message streaming and processing technologies such as Kafka, Spark, Storm, gRPC or other equivalent technologies
- Experience with one or more reactive microservice tools and techniques such as Akka, Vert.x, ReactiveX
- Strong experience in the creation, management and consumption of REST APIs leveraging Swagger, Postman, API gateways (such as MuleSoft, Apigee), etc.
- Strong knowledge of data modelling, querying, and performance tuning of any big-data stores (MongoDB, Elasticsearch, Redis, etc.) and/or any RDBMS (Oracle, PostgreSQL, MySQL, etc.)
- Experience working with Agile / Scrum based teams that utilize Continuous Integration/Continuous Delivery processes using Git, Maven, Jenkins, etc.
- Experience in container (Docker/Kubernetes) based deployment and management
- Experience in using AWS/GCP/Azure based cloud infrastructure
- Knowledge of Test-Driven Development and test automation skills with JUnit/TestNG
- Knowledge of security frameworks, concepts and technologies like Spring Security, OAuth2, SAML, SSO, Identity and Access Management
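The service-registry-and-discovery pattern listed above can be illustrated with a toy in-memory registry; real systems delegate this to Eureka, Consul, or etcd, and the class and method names below are hypothetical:

```python
# Toy service registry: instances register themselves under a service name,
# and clients discover one instance at lookup time (naive client-side
# load balancing via random choice).
import random

class ServiceRegistry:
    def __init__(self):
        self._services = {}

    def register(self, name, instance_url):
        self._services.setdefault(name, []).append(instance_url)

    def discover(self, name):
        instances = self._services.get(name, [])
        if not instances:
            raise LookupError(f"no instances registered for {name!r}")
        return random.choice(instances)

registry = ServiceRegistry()
registry.register("orders", "http://10.0.0.1:8080")
registry.register("orders", "http://10.0.0.2:8080")
url = registry.discover("orders")
print(url)
```

Production registries add what this sketch omits: heartbeats to evict dead instances, and health checks so discovery only returns routable endpoints.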
Key skill set: Apache NiFi, Kafka Connect (Confluent), Sqoop, Kylo, Spark, Druid, Presto, RESTful services, Lambda / Kappa architectures
Responsibilities:
- Build a scalable, reliable, operable and performant big data platform for both streaming and batch analytics
- Design and implement data aggregation, cleansing and transformation layers
Skills:
- Around 4+ years of hands-on experience designing and operating large data platforms
- Experience in Big Data ingestion, transformation and stream/batch processing technologies using Apache NiFi, Apache Kafka, Kafka Connect (Confluent), Sqoop, Spark, Storm, Hive, etc.
- Experience in designing and building streaming data platforms in Lambda and Kappa architectures
- Should have working experience with one of the NoSQL or OLAP data stores like Druid, Cassandra, Elasticsearch, Pinot, etc.
- Experience with one of the data warehousing tools like Redshift, BigQuery, Azure SQL Data Warehouse
- Exposure to other data ingestion, data lake and querying frameworks like Marmaray, Kylo, Drill, Presto
- Experience in designing and consuming microservices
- Exposure to security and governance tools like Apache Ranger, Apache Atlas
- Any contributions to open source projects are a plus
- Experience in performance benchmarks will be a plus
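The Lambda architecture named above combines a periodically recomputed batch view with an incremental speed-layer view, merged at query time. A minimal sketch with illustrative keys and counts:

```python
# Lambda-architecture serving layer sketch: answer queries by merging a
# precomputed batch view with a realtime (speed-layer) view. The views
# and key format here are hypothetical stand-ins for Druid/Redis tables.

batch_view = {"clicks:2024-01-01": 1000}   # rebuilt periodically from raw data
realtime_view = {"clicks:2024-01-01": 42}  # increments since the last batch run

def query(key):
    # Serving layer: combine both views for an up-to-date total.
    return batch_view.get(key, 0) + realtime_view.get(key, 0)

print(query("clicks:2024-01-01"))  # 1042 = 1000 batch + 42 realtime
```

A Kappa architecture drops the batch layer entirely and recomputes by replaying the stream, which is the trade-off the two names in the skill list refer to.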
Hands-on programming and technical design skills with a passion for learning new technologies. Experience building highly scalable, robust, and fault-tolerant services. 3+ years of experience designing and developing software systems or services. Good understanding of REST APIs and the web in general. Ability to build a feature from scratch and drive it to completion. Working experience with AWS. Knowledge of designing microservices. Knowledge of search platforms like Elasticsearch, Solr, etc. Knowledge of messaging technology such as Kafka or RabbitMQ. Startup experience is a strong plus. Experience with Python is a strong plus. Critical thinking is a plus.
Please apply if and only if you enjoy engineering, wish to write a lot of code, wish to do a lot of hands-on Python experimentation, and already have in-depth knowledge of deep learning. This position is strictly for people with knowledge of various neural networks who can customize a neural network, not for people whose experience is in downloading various AI/ML code. This position is not for freshers. We are looking for candidates with at least 4 years of AI/ML/CV experience in the industry.
Responsibilities: Work with the team to create backend services by translating application storyboards and use cases into functional applications. Design, build and maintain efficient, reusable, and reliable Python code. Ensure the best possible performance, quality, and responsiveness of the applications. Identify bottlenecks and bugs and devise solutions to these problems. Help maintain code quality, organization, and automation. Skills: Deep understanding of how RESTful APIs work. Familiarity with various design and architectural patterns. Sound knowledge of databases; MongoDB is a must. Work experience in Python, with knowledge of the Flask framework. Knowledge of user authentication and authorization between multiple systems, servers, and environments. Strong unit testing and debugging skills. Ability to communicate complex technical concepts to both technical and non-technical audiences. Experience with queue-based streaming systems like Kafka, Celery. Understanding of the fundamental design principles behind a scalable application. Able to create database schemas that represent and support business processes. Proficient understanding of code versioning tools, such as Git. We expect an entrepreneurial mindset: someone who is not afraid to take on new challenges every day and who considers the product their own by taking complete ownership of it.
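One simple form of the service-to-service authentication mentioned above is HMAC request signing with a shared secret, using only Python's standard library. This is a hand-rolled illustration (the secret and payload are made up), not a substitute for OAuth2 or JWT in production:

```python
# HMAC request-signing sketch: the caller signs the request body with a
# shared secret; the receiving service recomputes and compares the signature.
import hmac
import hashlib

SECRET = b"shared-secret"  # hypothetical; load from a secret store in practice

def sign(body: bytes) -> str:
    return hmac.new(SECRET, body, hashlib.sha256).hexdigest()

def verify(body: bytes, signature: str) -> bool:
    # compare_digest avoids leaking information through timing differences
    return hmac.compare_digest(sign(body), signature)

body = b'{"user": "alice"}'
sig = sign(body)
print(verify(body, sig))         # valid request: True
print(verify(b"tampered", sig))  # altered body is rejected: False
```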
Systems Engineer. About Intellicar Telematics Pvt Ltd: Intellicar Telematics Private Limited is a vehicular telematics organization founded in 2015 with the vision of connecting businesses and customers to their vehicles in a meaningful way. We provide vehicle owners with the ability to connect and diagnose vehicles remotely in real time. Our team consists of individuals with in-depth knowledge and understanding of automotive engineering, driver analytics and information technology. By leveraging our expertise in the automotive domain, we have created solutions to reduce operational and maintenance costs of large fleets, and ensure safety at all times. Solutions: Enterprise Fleet Management, GPS Tracking; Remote engine diagnostics, Driver behavior & training; Technology Integration: GIS, GPS, GPRS, OBD, WEB, Accelerometer, RFID, On-board Storage. Intellicar's team of accomplished automotive engineers, hardware manufacturers, software developers and data scientists have developed the best solutions to track vehicles and drivers, and ensure optimum performance, utilization and safety at all times. We cater to the needs of our clients across various industries such as: self-drive cars, taxi cab rentals, taxi cab aggregators, logistics, driver training, bike rentals, construction, e-commerce, armored trucks, manufacturing, dealerships and more.
Desired skills as a developer:
- Education: BE/B.Tech in Computer Science or a related field.
- 4+ years of experience with scalable distributed systems applications and building scalable multi-threaded server applications.
- Strong programming skills in Java or Scala on Linux or a Unix-based OS.
- Understanding of distributed systems like Hadoop, Spark, Cassandra, Kafka.
- Good understanding of HTTP, SQL, and database internals.
- Good understanding of the Internet and how it works.
- Create new features from scratch, enhance existing features and optimize existing functionality, from conception and design through testing and deployment.
- Work on projects that make our network more stable, faster, and more secure.
- Work with our development QA and system QA teams to come up with regression tests that cover new changes to our software.