Scala Jobs in Mumbai
Roles and Responsibilities
- We are looking for a savvy Data Engineering professional to join the newly formed Data Engineering team
- We are looking for Big Data specialists who have proven skills working on large-scale data systems
- The hire will be responsible for building and optimizing data pipeline architectures, as well as optimizing data flow and collection for multiple source systems
- The ideal candidate should be an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up
- Strong ability to build robust, resilient data pipelines that are fault-tolerant and reliable in terms of data movement
- Should have experience in batch and stream data processing
- Create end-to-end data products and productionise them on cloud/in-house servers
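Fault tolerance in data movement ultimately comes down to handling transient failures gracefully. As a minimal, library-free sketch of the idea (the `Resilience` object and its names are illustrative, not from any specific stack), a retry wrapper of the kind a pipeline step might use can be written in plain Scala:

```scala
import scala.util.{Try, Success, Failure}

object Resilience {
  // Retry a data-movement step up to `maxAttempts` times before giving up.
  // Illustrative only: a production pipeline would add backoff, metrics,
  // and alerting around this core loop.
  def retry[A](maxAttempts: Int)(step: () => A): Try[A] =
    Try(step()) match {
      case success @ Success(_)          => success
      case Failure(_) if maxAttempts > 1 => retry(maxAttempts - 1)(step)
      case failure                       => failure
    }
}
```

A step that fails twice and then succeeds would return `Success` on the third attempt, while a permanently broken step surfaces its last `Failure` for the caller to handle.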
Technical Skills
- Minimum 3-8 years of progressive experience building solutions in Big Data environments
- Should have solid hands-on experience with Big Data technologies like Hadoop, HBase, Hive, Pig, Oozie, MapReduce, Yarn, HDFS, Zookeeper, and Apache Kafka
- Hands-on experience with Apache Spark, with Java/Scala for batch and stream processing, will be highly preferred
- Minimum 6 months of hands-on experience with Apache Kafka
- Solid hands-on capabilities in SQL and NoSQL technologies
- Should be able to build performant, fault-tolerant, scalable solutions
- Excellent written and verbal communication skills
Skill: Spark and Scala, along with Azure
Location: Pan India
Looking for someone with Big Data along with Azure experience
LogiNext is looking for a technically savvy and passionate Software Engineer - Data Science to analyze large amounts of raw information to find patterns that will help improve our company. We will rely on you to build data products to extract valuable business insights.
In this role, you should be highly analytical with a knack for analysis, math and statistics. Critical thinking and problem-solving skills are essential for interpreting data. We also want to see a passion for machine-learning and research.
Your goal will be to help our company analyze trends to make better decisions. Data scientists without knowledge of how the software works may struggle in this role. Apart from experience developing in R and Python, you must know modern approaches to software development and their impact. DevOps skills such as continuous integration and deployment, and experience in cloud computing, are everyday requirements for managing and processing data.
Responsibilities:
- Identify valuable data sources and automate collection processes
- Undertake preprocessing of structured and unstructured data
- Analyze large amounts of information to discover trends and patterns
- Build predictive models and machine-learning algorithms
- Combine models through ensemble modeling
- Present information using data visualization techniques
- Propose solutions and strategies to business challenges
- Collaborate with engineering and product development teams
Requirements:
- Bachelor's degree or higher in Computer Science, Information Technology, Information Systems, Statistics, Mathematics, Commerce, Engineering, Business Management, Marketing or a related field from a top-tier school
- 2 to 3 years of experience in data mining, data modeling, and reporting
- Understanding of SaaS-based products and services
- Understanding of machine learning and operations research
- Experience with R, SQL and Python; familiarity with Scala, Java or C++ is an asset
- Experience using business intelligence tools (e.g. Tableau) and data frameworks (e.g. Hadoop)
- Analytical mind, business acumen and problem-solving aptitude
- Excellent communication and presentation skills
- Proficiency in Excel for data management and manipulation
- Experience in statistical modeling techniques and data wrangling
- Able to work independently and set goals keeping business objectives in mind
LogiNext is looking for a technically savvy and passionate Senior Software Engineer - Data Science to analyze large amounts of raw information to find patterns that will help improve our company. We will rely on you to build data products to extract valuable business insights.
In this role, you should be highly analytical with a knack for analysis, math and statistics. Critical thinking and problem-solving skills are essential for interpreting data. We also want to see a passion for machine-learning and research.
Your goal will be to help our company analyze trends to make better decisions. Data scientists without knowledge of how the software works may struggle in this role. Apart from experience developing in R and Python, you must know modern approaches to software development and their impact. DevOps skills such as continuous integration and deployment, and experience in cloud computing, are everyday requirements for managing and processing data.
Responsibilities :
- Adapt and enhance machine learning techniques based on physical intuition about the domain
- Design sampling methodology; prepare data, including data cleaning, univariate analysis and missing-value imputation; identify appropriate analytic and statistical methodology; develop predictive models; and document process and results
- Lead projects both as principal investigator and project manager, responsible for meeting project requirements on schedule and on budget
- Coordinate and lead efforts to innovate by deriving insights from heterogeneous sets of data generated by our suite of Aerospace products
- Support and mentor data scientists
- Maintain and work with our data pipeline that transfers and processes several terabytes of data using Spark, Scala, Python, Apache Kafka, Pig/Hive and Impala
- Work directly with application teams/partners (internal clients such as Xbox, Skype, Office) to understand their offerings/domain and help them become successful with data so they can run controlled experiments (A/B testing)
- Understand the data generated by experiments, and produce actionable, trustworthy conclusions from them
- Apply data analysis, data mining and data processing to present data clearly and develop experiments (A/B testing)
- Work with the development team to build tools for data logging and repeatable data tasks to accelerate and automate data scientist duties
Requirements:
- Bachelor's or Master's degree in Computer Science, Math, Physics, Engineering, Statistics or another technical field; PhD preferred
- 4 to 7 years of experience in data mining, data modeling, and reporting
- 3+ years of experience working with large data sets or doing large-scale quantitative analysis
- Expert SQL scripting required
- Development experience in one of the following: Scala, Java, Python, Perl, PHP, C++ or C#
- Experience working with Hadoop, Pig/Hive, Spark, MapReduce
- Ability to drive projects
- Basic understanding of statistics: hypothesis testing, p-values, confidence intervals, regression, classification, and optimization are core lingo
- Analysis: able to perform exploratory data analysis and derive actionable insights from the data, with impressive visualization
- Modeling: familiar with ML concepts and algorithms; understanding of the internals and pros/cons of models is required
- Strong algorithmic problem-solving skills
- Experience manipulating large data sets through statistical software (e.g. R, SAS) or other methods
- Superior verbal, visual and written communication skills to educate and work with cross-functional teams on controlled experiments
- Experimentation design or A/B testing experience is preferred
- Experience in team management
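Since the requirements treat hypothesis testing, p-values and A/B testing as core lingo, a small worked illustration may help. This sketch (the `ABTest` object and its parameter names are our own, not from any library) computes the classic two-proportion z-statistic used to compare conversion rates between control and treatment groups:

```scala
object ABTest {
  // Two-proportion z-statistic: how many standard errors apart the
  // observed conversion rates of groups A and B are, under the pooled
  // null hypothesis that both groups share one true rate.
  def zStatistic(successesA: Int, totalA: Int,
                 successesB: Int, totalB: Int): Double = {
    val pA    = successesA.toDouble / totalA
    val pB    = successesB.toDouble / totalB
    // Pooled proportion across both groups (the null-hypothesis estimate).
    val pPool = (successesA + successesB).toDouble / (totalA + totalB)
    val se    = math.sqrt(pPool * (1 - pPool) * (1.0 / totalA + 1.0 / totalB))
    (pA - pB) / se
  }
}
```

For a two-sided test, |z| above roughly 1.96 corresponds to p < 0.05; e.g. 200/1000 conversions versus 150/1000 gives z ≈ 2.94, a significant lift.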
- Big data development experience – Kafka, Hadoop
- Experience building data pipelines using Spark and/or Hive.
- Strong knowledge of Python
- Advanced proficiency in Scala, SQL, NoSQL
- Strong in database and data warehousing concepts
- Expertise in SQL, SQL tuning, schema design, Python and ETL processes
- Experience with cloud technologies required (Azure, data modeling, Azure Databricks, Azure Data Factory)
- Experience in working with Azure Data Lake, and Stream Analytics.
- Highly Motivated, Self-starter and quick learner
- Proficiency in Statistical procedures, Experiments and Machine Learning
techniques
- Must know the basics of data analytics and data modeling
- Excellent written and verbal communication skills
Roles/Responsibilities:
- Active involvement in the building of a recommendation engine
- Design new processes and build large, complex data sets
- Conduct statistical modeling and experiment design
- Test and validate predictive models
- Build web prototypes and perform data visualization
- Generate algorithms and create computer models
- Should possess excellent analytical skills and troubleshooting ideas
- Should be aware of the Agile mode of operations and should have been part of Scrum teams
- Should be open to working in the DevOps model, with responsibility for both development and support as the application goes live
- Should be able to work in shifts (if required)
- Should be open to working on fast-paced projects with multiple stakeholders
We are hiring for a Tier 1 MNC for a software developer role with good knowledge of Spark, Hadoop and Scala.
This company provides on-demand cloud computing platforms.
- 15+ years of Hands-on technical application architecture experience and Application build/ modernization experience
- 15+ years of experience as a technical specialist in Customer-facing roles.
- Ability to travel to client locations as needed (25-50%)
- Extensive experience architecting, designing and programming applications in an AWS Cloud environment
- Experience with designing and building applications using AWS services such as EC2, AWS Elastic Beanstalk, AWS OpsWorks
- Experience architecting highly available systems that utilize load balancing, horizontal scalability and high availability
- Hands-on programming skills in any of the following: Python, Java, Node.js, Ruby, .NET or Scala
- Agile software development expert
- Experience with continuous integration tools (e.g. Jenkins)
- Hands-on familiarity with CloudFormation
- Experience with configuration management platforms (e.g. Chef, Puppet, Salt, or Ansible)
- Strong scripting skills (e.g. Powershell, Python, Bash, Ruby, Perl, etc.)
- Strong practical application development experience on Linux and Windows-based systems
- Extracurricular software development passion (e.g. active open-source contributor)
- Hands-on experience in any Cloud Platform
- Microsoft Azure Experience
Job Description:
TeamExtn is looking for a passionate Senior Scala Engineer. You will be expected to build pragmatic solutions on mission-critical initiatives. If you know your stuff, see the beauty in code, have depth and breadth of knowledge, advocate best practices, and love to work with distributed systems, then this is an ideal position for you.
As a core member of our Special Projects team, you will work on various new projects in a startup-like environment. These projects may include such things as building new APIs (REST/GraphQL/gRPC) for new products, integrating new products with core Carvana services, building highly scalable back end processing systems in Scala and integrating with systems backed by Machine Learning. You will use cutting edge functional Scala libraries such as ZIO. You will have the opportunity to work closely with our Product, Experience and Software Engineering teams to deliver impact.
Responsibilities:
- Build highly scalable APIs and back end processing systems for new products
- Contribute in the full software development lifecycle from design and development to testing and operating in production
- Communicate effectively with engineers, product managers and data scientists
- Drive scalability and performance within our distributed AI platform
Skills And Experience:
- 4+ years of experience with Scala, Java or another functional language
- Experience with Akka and Lightbend stack
- Expert with PostgreSQL, MySQL or MS SQL
- Experience in architecting, developing, deploying and operating large scale distributed systems and actor systems
- Experience with cloud APIs (e.g., GCP, AWS, Azure)
- Messaging systems such as GCP Pub/Sub, RabbitMQ, Kafka
- Strong foundation in algorithms and data structures and their real-world use cases.
- Solid understanding of computer systems and networks
- Production quality coding standards and patterns
BONUS SKILLS:
- Experience with functional programming in Scala
- Knowledge of ZIO and related ecosystem
- Experience with functional database libraries in Scala (Quill preferred)
- Kubernetes and Docker
- Elasticsearch
- Typescript, React and frontend UI development experience
- gRPC, GraphQL
- Professional experience in enterprise java software development using Spring MVC framework , RESTful APIs and SOA
- Experience working in Cloud(AWS)
- Outstanding problem solving skills
- API Development experience
- Exposure to monitoring tools such as ELK, Splunk
- Experience with Selenium for UI automated tests written in Cucumber or Scala
- Able to handle day-to-day challenges and own the resolution of issues as they arise
Experience: 5-6+ years
Must Have
Job Description: Data Engineer with experience in the following areas.
Location: PAN India
- Augmenting, improving, redesigning, and/or re-implementing Dolat's low-latency/high-throughput production trading environment, which collects data from and disseminates orders to exchanges around the world
- Optimizing this platform by using network and systems programming, as well as other advanced techniques
- Developing systems that provide easy access to historical market data and trading simulations
- Building risk-management and performance-tracking tools
- Shaping the future of Dolat through regular interviewing and infrequent campus recruiting trips
- Implementing domain-optimized data structures
- Learn and internalize the theories behind the current trading system
- Participate in the design, architecture and implementation of automated trading systems
- Take ownership of system from design through implementation
Job Overview
We are looking for a Data Engineer to join our data team to solve data-driven critical
business problems. The hire will be responsible for expanding and optimizing the existing
end-to-end architecture including the data pipeline architecture. The Data Engineer will
collaborate with software developers, database architects, data analysts, data scientists and platform team on data initiatives and will ensure optimal data delivery architecture is
consistent throughout ongoing projects. The right candidate should have hands-on experience
in developing a hybrid set of data pipelines depending on the business requirements.
Responsibilities
- Develop, construct, test and maintain existing and new data-driven architectures.
- Align architecture with business requirements and provide solutions that best fit the business problems.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Azure ‘big data’ technologies.
- Data acquisition from multiple sources across the organization.
- Use programming language and tools efficiently to collate the data.
- Identify ways to improve data reliability, efficiency and quality
- Use data to discover tasks that can be automated.
- Deliver updates to stakeholders based on analytics.
- Set up practices on data reporting and continuous monitoring
Required Technical Skills
- Graduate in Computer Science or in similar quantitative area
- 1+ years of relevant work experience as a Data Engineer or in a similar role.
- Advanced SQL knowledge, data modelling, and experience working with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases.
- Experience in developing and optimizing ETL pipelines, big data pipelines, and data-driven architectures.
- Must have strong big data core knowledge and experience programming in Spark with Python/Scala
- Experience with an orchestration tool like Airflow or similar
- Experience with Azure Data Factory is good to have
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- Good understanding of Git workflow; test-case-driven development and use of CI/CD is good to have.
- Some understanding of Delta tables is good to have.
It would be an advantage if the candidate also has experience with the following software/tools:
- Experience with big data tools: Hadoop, Spark, Hive, etc.
- Experience with relational SQL and NoSQL databases
- Experience with cloud data services
- Experience with object-oriented/object function scripting languages: Python, Scala, etc.
Greetings from Amazon!
It is our pleasure to personally invite you to apply for a job with Amazon Development Centre India (ADCI). At Amazon we are inclined to hire people with a passion for technology, and you happen to be one of the shortlisted candidates. Our business is committed to recognizing potential and creating teams that embrace innovation.
Please find the Eligible criteria and requirements:
Job title : SDE – II (Software Development Engineer)
Role Opportunity : Permanent/Full Time/FTE/Regular
Work Location : Hyderabad/Bangalore/ Gurgaon
Must Have
- Strong Exposure to Data Structures, Algorithms, Coding, System Design (LLD, HLD, OOAD), Distributed systems, problem solving skills, Architecture (MVC/Microservices), logical thinking.
Amazon (ADCI) - If you are looking for an opportunity to solve deep technical problems and build innovative solutions in a fast paced environment working with smart, passionate software developers, this might be the role for you. Amazon’s transportation systems get millions of packages to customers worldwide faster and cheaper while providing world class customer experience – from checkout to shipment tracking to delivery. Our software systems include services that handle thousands of requests per second, make business decisions impacting billions of dollars a year, integrate with a network of small and large carriers worldwide, manage business rules for millions of unique products, and improve experience for millions of online shoppers. With rapid expansion into new geographies, innovations in supply chain, delivery models and customer experience, an increasingly complex transportation network, an ever expanding selection of products and a growing number of shipments worldwide, we have an opportunity to build software that scales the business, leads the industry through innovation and delights millions of customers worldwide.
As an SDE, you will develop a deep understanding of our business, work closely with development teams and own the architecture and end-to-end delivery of software components.
About Amazon India:
Amazon teams in India work on complex business challenges to innovate and create efficient solutions that enable various Amazon businesses, including Amazon websites across the world as well as support Payments, Transportation, and Digital products and services like the Kindle family of tablets, e-readers and the store. We are proud to have some of the finest talent and strong leaders with proven experience working to make Amazon the Earth’s most customer-centric company.
We made our foray into the Indian market with the launch of Junglee.com, enabling retailers in India to advertise their products to millions of Indian shoppers and drive targeted traffic to their stores. In June 2013, we launched www.amazon.in for shoppers in India. With www.amazon.in , we endeavor to give customers more of what they want – low prices, vast selection, fast and reliable delivery, and a trusted and convenient online shopping experience. In just over a year of launching our India operations, we have expanded our offering to over 18 million products across 36 departments and 100s of categories! Our philosophy of working backwards from the customers is what drives our growth and success.
We will continue to strive to become a trusted and meaningful sales and logistics channel for retailers of all sizes across India and a fast, reliable and convenient online shopping destination for consumers. For us, it is always “Day 1” and we are committed to aggressively invest over the long-term and relentlessly focus on raising the bar for customer experience in India. Amazon India offers opportunities where you can dive right in, work with smart people on challenging problems and make an impact that contributes to the lives of millions. Join us so you can - Work Hard, Have Fun and Make History.
Basic Qualifications:
- 3+ years’ experience building successful production software systems
- A solid grounding in Computer Science fundamentals (based on a BS or MS in CS or related field)
- The ability to convert raw requirements into good designs while exploring technical feasibility tradeoffs
- Expertise in system design (design patterns, LLD, HLD, SOLID principles, OOAD, distributed systems, etc.) and architecture (MVC/microservices)
- Good understanding of at least some modern programming languages (Java) and open-source technologies (C++, Python, Scala, C#, PHP, Ruby, etc.)
- Excellence in technical communication
- Has experience in mentoring other software developers
Preferred Qualifications:
- BS/MS in Computer Science or equivalent
- Experience developing service oriented architectures and an understanding of design for scalability, performance and reliability
- Demonstrated ability to mentor other software developers to maintain architectural vision and software quality
- Demonstrated ability to achieve stretch goals in a highly innovative and fast paced environment
- Expertise in delivering high-quality, innovative applications
- Strong desire to build, sense of ownership, urgency, and drive
- Strong organizational and problem solving skills with great attention to detail
- Ability to triage issues, react well to changes, work with teams and ability to multi-task on multiple products and projects.
- Experience building highly scalable, high availability services
- The ideal candidate will be a visionary leader, builder and operator.
- He/she should have experience leading or contributing to multiple simultaneous product development efforts and initiatives.
- He/she needs to balance technical leadership with strong business judgment to make the right decisions about technology choices.
- He/she needs to be constantly striving for simplicity, and at the same time demonstrate significant creativity, innovation and judgment
- Proficiency in, at least, one modern programming language.
- Experience in SQL or Non-SQL database.
- Strong sense of ownership, urgency, and drive.
- Demonstrated leadership abilities in an engineering environment in driving operational excellence and best practices.
- Demonstrated ability to achieve stretch goals in a highly innovative and fast paced environment.
- Excellent communication, collaboration, reporting, analytical and problem solving skills.
Good to Have:
- Knowledge of professional software engineering practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
- Experience with enterprise-wide systems
- Experience influencing software engineers best practices within your team
- Hands-on expertise in many disparate technologies, typically ranging from front-end user interfaces through to back-end systems and all points in between
- Strong written and verbal communication skills preferred
Key Points to remember:
- Strong knowledge of the Software Development Life Cycle methodology
- Technical design, development and implementation decisions on the use of technology in area(s) of specialization.
- Write or modify programming code to suit customer's needs.
- Unit test to assure meets requirements, including integration test as needed.
- Ability to understand and analyze issues, and use judgment to make decisions
- Strong problem solving & troubleshooting skills
- Strong communication skills
- Responsible for self-development according to professional development plan
Understanding and solving real business needs at a large scale by applying your analytical problem-solving skills
Designing & building solutions for edge-layer applications like GraphQL
Identifying and optimising performance bottlenecks
Architecting and building robust, scalable, and highly available solutions for use cases like real-time updates, data parsing and aggregation
Leading cross-functional initiatives and collaborating with engineers across teams
Must Have:
Hands-on experience in Scala, Node.js or Rust
Strong problem-solving skills and reasoning ability
Good to Have:
Experience in developing performant, high-throughput systems
Strong system designing skills, preferably in designing edge layer applications
Experience in functional programming, preferably with a working knowledge of type classes
Experience in writing testable programs.
Experience in working with the AWS stack
Prior experience with GraphQL
Experience in identifying and optimising hotspots and performance bottlenecks
An understanding of operating systems and networking fundamentals
Note: Applications accepted only from candidates who have worked in product based companies
Understand various raw data input formats, build consumers on Kafka/ksqldb for them and ingest large amounts of raw data into Flink and Spark.
Conduct complex data analysis and report on results.
Build various aggregation streams for data and convert raw data into various logical processing streams.
Build algorithms to integrate multiple sources of data and create a unified data model from all the sources.
Build a unified data model on both SQL and NoSQL databases to act as a data sink.
Communicate the designs effectively with the fullstack engineering team for development.
Explore machine learning models that can be fitted on top of the data pipelines.
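The aggregation-stream and unified-data-model responsibilities above can be sketched, independently of Flink or Spark, with plain Scala collections. The case classes below are hypothetical stand-ins for real record schemas, not part of any actual product:

```scala
object Aggregation {
  // A raw input record as it might arrive off a Kafka topic (simplified).
  case class RawEvent(source: String, key: String, value: Double)

  // One row of the unified model that both SQL and NoSQL sinks would receive.
  case class Aggregate(key: String, count: Int, total: Double)

  // Fold a batch of raw events from multiple sources into one aggregate
  // per logical key - the batch analogue of a keyed aggregation stream.
  def unify(events: Seq[RawEvent]): Map[String, Aggregate] =
    events.groupBy(_.key).map { case (k, es) =>
      k -> Aggregate(k, es.size, es.map(_.value).sum)
    }
}
```

In a streaming engine the same shape appears as a keyed window aggregation; here `groupBy` plus a fold stands in for that keyed state.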
Mandatory Qualifications/Skills:
Deep knowledge of Scala and Java programming languages is mandatory
Strong background in streaming data frameworks (Apache Flink, Apache Spark) is mandatory
Good understanding and hands on skills on streaming messaging platforms such as Kafka
Familiarity with R, C and Python is an asset
Analytical mind and business acumen with strong math skills (e.g. statistics, algebra)
Problem-solving aptitude
Excellent communication and presentation skills
- Should be very strong in Scala development (coding)
- Along with any combination of Java/Python/Spark/Big Data
- 3+ years experience in Core Java/Scala with good understanding of multithreading
- The candidate must be good with Computer Science fundamentals
- Exposure to Python/Perl and Unix/K-shell scripting
- Code management tools such as Git/Perforce.
- Experience with large batch-oriented systems
- DB2/Sybase or any RDBMS
- Prior experience with financial products, particularly OTC Derivatives
- Exposure to counterparty risk, margining, collateral or confirmation systems
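A good understanding of multithreading, as asked for above, usually includes knowing when to reach for lock-free primitives instead of locks. A minimal sketch (illustrative only, using the JVM's standard `java.util.concurrent.atomic` package):

```scala
import java.util.concurrent.atomic.AtomicLong

object SafeCounter {
  // A lock-free counter safely shared across threads: incrementAndGet is
  // an atomic read-modify-write, so no updates are lost under contention.
  private val count = new AtomicLong(0)

  def increment(): Long = count.incrementAndGet()
  def value: Long       = count.get()
}
```

Four threads each incrementing 1,000 times always end at exactly 4,000, whereas a plain `var count: Long` with unsynchronized `count += 1` would lose updates.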
Job description
Responsibilities
- Participate in requirements analysis
- Collaborate with internal teams to produce software design and architecture
- Write clean, scalable code using .NET programming languages
- Test and deploy applications and systems
- Revise, update, refactor and debug code
- Improve existing software
- Develop documentation throughout the software development life cycle (SDLC)
- Serve as an expert on applications and provide technical support
Requirements
- Proven experience as a .NET Developer or Application Developer
- Familiarity with the ASP.NET framework, SQL Server and design/architectural patterns (e.g. Model-View-Controller (MVC))
- Knowledge of at least one of the .NET languages (e.g. C#, Visual Basic .NET) and HTML5/CSS3
- Familiarity with architecture styles/APIs (REST, RPC)
- Understanding of Agile methodologies
- Excellent troubleshooting and communication skills
- Attention to detail
- BSc/BA in Computer Science, Engineering or a related field
No. of Requirements: 3
Experience: 1-3 yrs
Location: Goregaon, Mumbai
Urgent joining.
Required Skills:
.Net Framework 3.5
C#
Visual Studio 2018
Debugging
Asp.net MVC
EntityFramework
Jquery
Bootstrap
HTML
CSS
Javascript
We are an upcoming profitable social enterprise, and as a part of the team we are looking for a candidate who can work with our team to build better analytics and intelligence into our platform, Prabhaav.
We are looking for a Software Developer to build and implement functional programs. You will work with other Developers and Product Managers throughout the software development life cycle.
In this role, you should be a team player with a keen eye for detail and problem-solving skills. If you also have experience in Agile frameworks and popular coding languages (e.g. JavaScript), we would like to meet you.
Your goal will be to build efficient programs and systems that serve user needs.
Technical Skills we are looking for are:
- Producing clean, efficient code based on specifications
- Coding abilities in HTML, PHP, JS, JSP/Servlets, Java, DevOps (basic knowledge)
- Additional skills (preferred): Node.js, Python, AngularJS
- System administrator experience: Linux (Ubuntu/RedHat), Windows CE-Embedded
- Database experience: MySQL, Postgres, MongoDB
- Data format experience: JSON, XML, AJAX, jQuery
- Should have depth in software architecture design, especially for stand-alone software-as-product or SaaS platforms
- Should have basic experience/knowledge of microservices, REST APIs and SOAP methodologies
- Should have built backend architecture for long-standing applications
- Good HTML design sense
- Experience with AWS services like EC2 and Lightsail is preferred
- Testing and deploying programs and systems
- Fixing and improving existing software
- Good understanding of OOP and similar concepts
- Research new JS frameworks like React and Angular
Experience areas we are looking for:
- Proven experience as a Software Developer, Software Engineer or similar role
- Familiarity with Agile development methodologies
- Experience with software design and development in a test-driven environment
- Knowledge of coding languages (e.g. Java, JavaScript) and frameworks/systems (e.g. AngularJS, Git)
- Experience with databases and Object-Relational Mapping (ORM) frameworks (e.g. Hibernate)
- Ability to learn new languages and technologies
- Excellent communication skills
- Resourcefulness and troubleshooting aptitude
- Attention to detail
The Data Engineering team is one of the core technology teams of Lumiq.ai and is responsible for creating all the Data related products and platforms which scale for any amount of data, users, and processing. The team also interacts with our customers to work out solutions, create technical architectures and deliver the products and solutions.
If you are someone who is always pondering how to make things better, how technologies can interact, how various tools, technologies, and concepts can help a customer or how a customer can use our products, then Lumiq is the place of opportunities.
Who are you?
- Enthusiast is your middle name. You know what’s new in Big Data technologies and how things are moving
- Apache is your toolbox and you have been a contributor to open source projects or have discussed the problems with the community on several occasions
- You use cloud for more than just provisioning a Virtual Machine
- Vim is friendly to you and you know how to exit Nano
- You check logs before screaming about an error
- You are a solid engineer who writes modular code and commits to Git
- You are a doer who doesn’t say “no” without first understanding
- You understand the value of documentation of your work
- You are familiar with the Machine Learning ecosystem and how you can help your fellow Data Scientists explore data and create production-ready ML pipelines
Eligibility
Experience
- At least 2 years of Data Engineering Experience
- Have interacted with Customers
Must Have Skills
- Amazon Web Services (AWS) - EMR, Glue, S3, RDS, EC2, Lambda, SQS, SES
- Apache Spark
- Python
- Scala
- PostgreSQL
- Git
- Linux
Good to have Skills
- Apache NiFi
- Apache Kafka
- Apache Hive
- Docker
- Amazon Certification
Your mission is to help lead team towards creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing and application programming will help your team raise their game, meeting your standards, as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex and mission critical problems, internally and externally. Your quest to embracing leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties :
- As a Data Engineer, you will be responsible for developing data pipelines for numerous applications handling all kinds of data: structured, semi-structured, and unstructured. Big data knowledge, especially in Spark and Hive, is highly preferred.
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions
Education level :
- Bachelor's degree in Computer Science or equivalent
Experience :
- Minimum 3+ years of relevant experience working on production-grade projects, with hands-on, end-to-end software development experience
- Expertise in application, data and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
Proficiency in :
- Modern programming languages like Java, Python, Scala
- Big Data technologies Hadoop, Spark, HIVE, Kafka
- Writing well-optimized SQL queries
- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (Optional)
- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and Agile.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for transactional systems and operational reporting, including the development of, or interfacing with, data analysis, data mapping, and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
Skills :
Must Know :
- Core big-data concepts
- Spark - PySpark/Scala
- At least one data integration tool, such as Pentaho, NiFi, or SSIS
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow
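One of the "Must Know" items above is handling various file formats. As a small illustrative sketch (plain Python standard library, not Spark; the sample records and field names are made up for the example), normalizing CSV and JSON input into one record shape might look like:

```python
import csv
import io
import json

def read_records(text: str, fmt: str):
    """Normalize CSV or JSON input into a list of dicts with string values."""
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(text)))
    if fmt == "json":
        data = json.loads(text)
        return data if isinstance(data, list) else [data]
    raise ValueError(f"unsupported format: {fmt}")

# The same logical records, arriving in two different formats.
csv_text = "id,name\n1,Asha\n2,Ravi\n"
json_text = '[{"id": "1", "name": "Asha"}, {"id": "2", "name": "Ravi"}]'

# Downstream code sees one shape regardless of the source format.
assert read_records(csv_text, "csv") == read_records(json_text, "json")
```

The same idea scales up in Spark, where `spark.read.csv` / `spark.read.json` produce a uniform DataFrame from heterogeneous sources.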
Your mission is to help lead team towards creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing and application programming will help your team raise their game, meeting your standards, as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex and mission critical problems, internally and externally. Your quest to embracing leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties :
- As a Data Engineer, you will be responsible for developing data pipelines for numerous applications handling all kinds of data: structured, semi-structured, and unstructured. Big data knowledge, especially in Spark and Hive, is highly preferred.
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions
Education level :
- Bachelor's degree in Computer Science or equivalent
Experience :
- Minimum 5+ years of relevant experience working on production-grade projects, with hands-on, end-to-end software development experience
- Expertise in application, data and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
Proficiency in :
- Modern programming languages like Java, Python, Scala
- Big Data technologies Hadoop, Spark, HIVE, Kafka
- Writing well-optimized SQL queries
- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (Optional)
- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and Agile.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for transactional systems and operational reporting, including the development of, or interfacing with, data analysis, data mapping, and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
Skills :
Must Know :
- Core big-data concepts
- Spark - PySpark/Scala
- At least one data integration tool, such as Pentaho, NiFi, or SSIS
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow
What's the role?
Your role as a Principal Engineer will involve working with various teams. As a principal engineer, you will need full knowledge of the software development lifecycle and Agile methodologies. You will demonstrate multi-tasking skills under tight deadlines and constraints. You will regularly contribute to the development of work products (including analyzing, designing, programming, debugging, and documenting software) and may work with customers to resolve challenges and respond to suggestions for improvements and enhancements. You will set the standards and principles for the product you drive.
- Set up coding practices, guidelines, and quality standards for the software delivered.
- Determines operational feasibility by evaluating analysis, problem definition, requirements, solution development, and proposed solutions.
- Documents and demonstrates solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code.
- Prepares and installs solutions by determining and designing system specifications, standards, and programming.
- Improves operations by conducting systems analysis; recommending changes in policies and procedures.
- Updates job knowledge by studying state-of-the-art development tools, programming techniques, and computing equipment; participating in educational opportunities; reading professional publications; maintaining personal networks; participating in professional organizations.
- Protects operations by keeping information confidential.
- Develops software solutions by studying information needs; conferring with users; studying systems flow, data usage, and work processes; investigating problem areas; following the software development lifecycle.
Who are you? You are a go-getter with an eye for detail, strong problem-solving and debugging skills, and a BE/MCA/ME/MTech or equivalent degree from a reputed college/university.
Essential Skills / Experience:
- 10+ years of engineering experience
- Experience in designing and developing high volume web-services using API protocols and data formats
- Proficient in API modelling languages and annotation
- Proficient in Java programming
- Experience with Scala programming
- Experience with ETL systems
- Experience with Agile methodologies
- Experience with Cloud service & storage
- Proficient in Unix/Linux operating systems
- Excellent oral and written communication skills
Preferred:
- Functional programming languages (Scala, etc)
- Scripting languages (bash, Perl, Python, etc)
- Amazon Web Services (Redshift, ECS etc)
Responsibilities:
- Own end to end development and operations of high-performance Spring Hibernate Applications.
- Design the architecture and deliver clean, testable and scalable code
- Participate in requirement gathering and display a strong sense of ownership and delivery
Skills and Qualifications:
- Strong in data structures, algorithms, and object-oriented concepts; message queues and caching
- BE/ B.Tech preferred
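The "message queues and caching" item above can be illustrated with a minimal, generic sketch (Python standard library; the producer/consumer hand-off and the memoized worker step are illustrative only, not this role's actual stack):

```python
import queue
from functools import lru_cache

# A tiny producer/consumer hand-off backed by a thread-safe FIFO queue.
jobs: "queue.Queue[int]" = queue.Queue()
for n in (1, 2, 3):
    jobs.put(n)

@lru_cache(maxsize=128)
def square(n: int) -> int:
    """Memoized worker step: repeated inputs hit the cache instead of recomputing."""
    return n * n

# Consume everything that was enqueued, in FIFO order.
results = [square(jobs.get()) for _ in range(3)]
assert results == [1, 4, 9]
```

In production the queue would typically be an external broker (e.g. Kafka or SQS, both named elsewhere in these postings) and the cache a shared store, but the contract is the same.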
Job Summary
- Excellent hands-on experience with Go (if not Go, then Java, .NET, and/or Node.js)
- Write cron jobs and background tasks required for the growth of the business and product
- Build REST APIs as required
- Ability to code using design principles
- Write reusable code and libraries for future use
- Working knowledge of microservices architecture using Docker
- Collaborate with other team members and stakeholders in executing various new and existing ideas
- Knowledge of developing and deploying in Linux environments
- Passion for building great products and loads of energy.
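The "Build REST APIs as required" item above can be sketched in-process with the Python standard library (a hedged illustration: the `/health` route and JSON payload are assumptions for the example, and a real service would use a proper framework):

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    """Minimal REST-style endpoint: GET /health returns a JSON body."""
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        pass  # keep demo output quiet

# Exercise the endpoint in-process: bind to a free port, query it, shut down.
server = HTTPServer(("127.0.0.1", 0), HealthHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
with urllib.request.urlopen(f"http://127.0.0.1:{port}/health") as resp:
    payload = json.loads(resp.read())
server.shutdown()
assert payload == {"status": "ok"}
```

Binding to port 0 lets the OS pick a free port, which keeps the sketch runnable anywhere.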
Key Skills
Skills that we would be more than happy for a dev to have:
- Worked in CI/CD environments
- Developed code using a TDD/BDD approach
- Worked with virtualization on Linux (KVM)
- Experience working in an Agile development environment
About You
We’re looking for exceptional Engineers with an amazing breadth and depth of technology expertise! If you’re the kind of person that looks at the bigger picture and want to build something that has a real impact on the end user, go ahead and apply for the position.
Ability to see the big picture but still love to code!
Strong in backend languages, such as Java, .NET or Node.js!
Familiar with client-side frameworks such as React, Angular, Vue etc.
Strong HTML/CSS skills – you understand not only how to build the data, but how to make it look great too.
Knowledge of architectural design and you like to build something scalable and flexible to support business
Agile or Scrum is your favorite development approach.
And when we start talking about performance, security and unit testing? Well, that's music to your ears.