24+ Apache HBase Jobs in India
Apply to 24+ Apache HBase Jobs on CutShort.io. Find your next job, effortlessly. Browse Apache HBase Jobs and apply today!
Sigmoid works with a variety of clients, from start-ups to Fortune 500 companies. We are looking for a detail-oriented self-starter to assist our engineering and analytics teams in various roles as a Software Development Engineer.
This position will be part of a growing team working towards building world-class, large-scale Big Data architectures. This individual should have a sound understanding of programming principles and experience programming in Java, Python, or similar languages, and can expect to spend a majority of their time coding.
Location - Bengaluru and Hyderabad
Responsibilities:
● Good development practices
○ Hands-on coder with good experience in programming languages like Java or Python.
○ Hands-on experience with the Big Data stack: PySpark, HBase, Hadoop, MapReduce, and Elasticsearch.
○ Good understanding of programming principles and development practices like check-in policy, unit testing, and code deployment.
○ Self-starter, able to grasp new concepts and technologies and translate them into large-scale engineering developments.
○ Excellent experience in application development and support, integration development, and data management.
● Align Sigmoid with key Client initiatives
○ Interface daily with customers across leading Fortune 500 companies to understand strategic requirements
● Stay up to date on the latest technology to ensure the greatest ROI for the customer and Sigmoid
○ Hands-on coder with a good understanding of enterprise-level code.
○ Design and implement APIs, abstractions, and integration patterns to solve challenging distributed-computing problems.
○ Experience in defining technical requirements, data extraction, data transformation, automating and productionizing jobs, and exploring new big data technologies within a parallel-processing environment
● Culture
○ Must be a strategic thinker with the ability to think unconventionally / out of the box.
○ Analytical and data-driven orientation.
○ Raw intellect, talent, and energy are critical.
○ Entrepreneurial and agile: understands the demands of a private, high-growth company.
○ Ability to be both a leader and hands on "doer".
Qualifications:
- A track record of relevant work experience and a degree in Computer Science or a related technical discipline are required
- Experience with functional and object-oriented programming; Java is a must.
- Hands-on knowledge of MapReduce, Hadoop, PySpark, HBase, and Elasticsearch.
- Effective communication skills (both written and verbal)
- Ability to collaborate with a diverse set of engineers, data scientists and product managers
- Comfort in a fast-paced start-up environment
Preferred Qualifications:
- Technical knowledge of MapReduce, Hadoop, and the GCS stack is a plus.
- Experience with agile methodology
- Experience with database modeling and development, data mining, and warehousing.
- Experience in the architecture and delivery of enterprise-scale applications; capable of developing frameworks, design patterns, etc. Should be able to understand and tackle technical challenges, propose comprehensive solutions, and guide junior staff
- Experience working with large, complex data sets from a variety of sources
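Several of the postings on this page ask for hands-on MapReduce knowledge. As an illustrative sketch only (plain Python standing in for Hadoop's Java API; the function names are hypothetical, not Hadoop interfaces), the map/shuffle/reduce model behind the classic word count looks like this:

```python
from collections import defaultdict

def map_phase(doc):
    # Mapper: emit a (word, 1) pair for every word in the input split.
    return [(word.lower(), 1) for word in doc.split()]

def shuffle(pairs):
    # Shuffle: group values by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reducer: sum the counts emitted for each word.
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data big plans", "big data"]
pairs = [p for d in docs for p in map_phase(d)]
counts = reduce_phase(shuffle(pairs))
print(counts["big"])  # 3
```

In a real Hadoop job the mapper and reducer run on different nodes and the shuffle moves data across the network; the split of logic into those three phases is the part interviewers typically probe.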
Lead Data Engineer
Data Engineers develop modern data architecture approaches to meet key business objectives and provide end-to-end data solutions. You might spend a few weeks with a new client on a deep technical review or a complete organizational review, helping them to understand the potential that data brings to solve their most pressing problems. On other projects, you might be acting as the architect, leading the design of technical solutions, or perhaps overseeing a program inception to build a new product. It could also be a software delivery project where you're equally happy coding and tech-leading the team to implement the solution.
Job responsibilities
· You might spend a few weeks with a new client on a deep technical review or a complete organizational review, helping them to understand the potential that data brings to solve their most pressing problems
· You will partner with teammates to create complex data processing pipelines in order to solve our clients' most ambitious challenges
· You will collaborate with Data Scientists in order to design scalable implementations of their models
· You will pair to write clean and iterative code based on TDD
· Leverage various continuous delivery practices to deploy, support and operate data pipelines
· Advise and educate clients on how to use different distributed storage and computing technologies from the plethora of options available
· Develop and operate modern data architecture approaches to meet key business objectives and provide end-to-end data solutions
· Create data models and speak to the tradeoffs of different modeling approaches
· On other projects, you might be acting as the architect, leading the design of technical solutions, or perhaps overseeing a program inception to build a new product
· Seamlessly incorporate data quality into your day-to-day work as well as into the delivery process
· Assure effective collaboration between Thoughtworks' and the client's teams, encouraging open communication and advocating for shared outcomes
Job qualifications
Technical skills
· You are equally happy coding and leading a team to implement a solution
· You have a track record of innovation and expertise in Data Engineering
· You're passionate about craftsmanship and have applied your expertise across a range of industries and organizations
· You have a deep understanding of data modelling and experience with data engineering tools and platforms such as Kafka, Spark, and Hadoop
· You have built large-scale data pipelines and data-centric applications using any of the distributed storage platforms such as HDFS, S3, and NoSQL databases (HBase, Cassandra, etc.) and any of the distributed processing platforms like Hadoop, Spark, Hive, Oozie, and Airflow in a production setting
· Hands-on experience with MapR, Cloudera, Hortonworks, and/or cloud-based Hadoop distributions (AWS EMR, Azure HDInsight, Qubole, etc.)
· You are comfortable taking data-driven approaches and applying data security strategy to solve business problems
· You're genuinely excited about data infrastructure and operations with a familiarity working in cloud environments
· Working with data excites you: you have created big data architectures, you can build and operate data pipelines, and you can maintain data storage, all within distributed systems
Professional skills
· Advocate your data engineering expertise to the broader tech community outside of Thoughtworks, speaking at conferences and acting as a mentor for more junior-level data engineers
· You're resilient and flexible in ambiguous situations and enjoy solving problems from technical and business perspectives
· An interest in coaching others, sharing your experience and knowledge with teammates
· You enjoy influencing others and always advocate for technical excellence while being open to change when needed
Requirements
• Extensive and expert programming experience in at least one general programming language (e.g. Java, C, C++) & tech stack to write maintainable, scalable, unit-tested code.
• Experience with multi-threaded and concurrent programming.
• Extensive experience in object-oriented design, knowledge of design patterns, and a huge passion for and ability to design intuitive modules and class-level interfaces.
• Excellent coding skills - should be able to convert design into code fluently.
• Knowledge of Test-Driven Development.
• Good understanding of databases (e.g. MySQL) and NoSQL stores (e.g. HBase, Elasticsearch, Aerospike, etc.).
• Strong desire to solve complex and interesting real-world problems.
• Experience with full-lifecycle development in any programming language on a Linux platform.
• Go-getter attitude that reflects in energy and intent behind assigned tasks.
• Worked in a startup-like environment with high levels of ownership and commitment.
• BTech, MTech, or Ph.D. in Computer Science or a related technical discipline (or equivalent).
• Experience in building highly scalable business applications, which involve implementing large, complex business flows and dealing with huge amounts of data.
• 3+ years of experience in the art of writing code and solving problems on a large scale.
• Open communicator who shares thoughts and opinions frequently, listens intently, and takes constructive feedback.
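The multi-threading and concurrency requirement above comes down to reasoning about shared state. A minimal sketch (Python here for brevity, though the posting emphasizes Java; the same lock-around-shared-state idea applies in either language): without the lock, the read-modify-write on the shared counter can interleave across threads and lose updates.

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        with lock:          # serialize the read-modify-write on the shared counter
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 40000 - deterministic because the lock guards every update
```

Dropping the `with lock:` line turns this into a classic race condition, which is exactly the kind of bug such interviews explore.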
at Play Games24x7
• B.E./B.Tech. in Computer Science or MCA from a reputed university.
• 3.5+ years of experience in software development, with emphasis on Java/J2EE server-side programming.
• Hands-on experience in core Java, multithreading, RMI, socket programming, JDBC, NIO, web services, and design patterns.
• Knowledge of distributed systems, distributed caching, messaging frameworks, ESB, etc.
• Experience with the Linux operating system and PostgreSQL/MySQL/MongoDB/Cassandra databases.
• Additionally, knowledge of HBase, Hadoop, and Hive is desirable.
• Familiarity with message-queue systems such as AMQP and Kafka is desirable.
• Experience as a participant in agile methodologies.
• Excellent written and verbal communication skills and presentation skills.
• This is not a full-stack requirement; we are looking for a purely backend expert.
What you'll do:
Design and development of scalable applications.
Collaborate with tech leads to gain maximum understanding of the underlying infrastructure.
Contribute to continual improvement by suggesting improvements to the software system.
Ensure high scalability and performance.
You will advocate for clean, well-documented, performant code; follow standards and best practices.
We'd love for you to have:
Education: Bachelor's/Master's degree in Computer Science
Experience: 1-3 years of relevant experience in BI/Big Data with hands-on coding experience
Mandatory Skills
Strong problem-solving ability
Good exposure to Big Data technologies: Hive, Hadoop, Impala, HBase, Kafka, Spark
Strong experience in data engineering
Able to comprehend challenges related to database and data-warehousing technologies, and able to understand complex designs and system architecture
Experience with the software development lifecycle: design, develop, review, debug, document, and deliver (especially in a multi-location organization)
Working knowledge of Java and Python
Desired Skills
Experience with reporting tools like Tableau and QlikView
Awareness of CI/CD pipelines
Inclination to work on a cloud platform, e.g. AWS
Crisp communication skills with team members and business owners
Be able to work in a challenging, dynamic environment and meet tight deadlines
Location: Bangalore/Pune/Hyderabad/Nagpur
4-5 years of overall experience in software development.
- Experience with Hadoop (Apache/Cloudera/Hortonworks) and/or other MapReduce platforms
- Experience with Hive, Pig, Sqoop, Flume, and/or Mahout
- Experience with NoSQL: HBase, Cassandra, MongoDB
- Hands-on experience with Spark development; knowledge of Storm, Kafka, Scala
- Good knowledge of Java
- Good background in configuration-management/ticketing systems like Maven/Ant/JIRA, etc.
- Knowledge of any data integration and/or EDW tools is a plus
- Good to have knowledge of Python/Perl/Shell
Please note: HBase, Hive, and Spark are a must.
- 3+ years of SDE work experience from Product based companies
- Experience in Java, Spring Boot, MySQL, Kafka, HBase, AWS
- Experience in multithreading, distributed systems, and best practices for coding and scaling
Senior SRE - Acceldata (IC3 Level)
About the Job
You will join a team of highly skilled engineers who are responsible for delivering Acceldata's support services. Our Site Reliability Engineers are trained to be active listeners and demonstrate empathy when customers encounter product issues. In our fun and collaborative environment, Site Reliability Engineers develop strong business, interpersonal, and technical skills to deliver high-quality service to our valued customers.
When you arrive for your first day, we’ll want you to have:
- Solid troubleshooting skills: using a logical, systematic search for the source of a problem in order to solve it and make the failed product or process operational again
- A strong ability to understand the feelings of our customers as we empathize with them on the issue at hand
- A strong desire to grow your product and technology skillset and your confidence supporting our products, so you can help our customers succeed
In this position you will…
- Provide support services to our Gold & Enterprise customers using our flagship Acceldata Pulse, Flow & Torch product suites. This may include assistance provided during the engineering and operations of distributed systems as well as responses for mission-critical systems and production customers.
- Demonstrate the ability to actively listen to customers and show empathy for the customer's business impact when they experience issues with our products
- Participate in the queue-management and coordination process by owning customer escalations and managing the unassigned queue.
- Be involved with and work on other support-related activities - performing POCs and assisting with onboarding deployments of Acceldata and Hadoop-distribution products.
- Triage, diagnose and escalate customer inquiries when applicable during their engineering and operations efforts.
- Collaborate and share solutions with both customers and the Internal team.
- Investigate product related issues both for particular customers and for common trends that may arise
- Study and understand critical system components and large cluster operations
- Differentiate between issues that arise in operations, user code, or product
- Coordinate enhancement and feature requests with product management and Acceldata engineering team.
- Flexibility to work in shifts.
- Participate in a rotational weekend on-call roster for critical support needs.
- Participate as a designated or dedicated engineer for specific customers. Aspects of this engagement translate to building long-term, successful relationships with customers, leading weekly status calls, and occasionally visiting customer sites
In this position, you should have…
- A strong desire and aptitude to become a well-rounded support professional. Acceldata Support considers the service we deliver as our core product.
- A positive attitude towards feedback and continual improvement
- A willingness to give direct feedback to and partner with management to improve team operations
- A tenacity to bring calm and order to the often stressful situations of customer cases
- A mental capability to multi-task across many customer situations simultaneously
- Bachelor's degree in Computer Science or Engineering, or equivalent experience. A Master's degree is a plus
- 2+ years of experience with at least one of the following cloud platforms: Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP), including experience managing and supporting a cloud infrastructure on any of the three. Knowledge of Kubernetes and Docker is also a must.
- Strong troubleshooting skills (e.g., TCP/IP, DNS, file systems, load balancing, databases, Java)
- Excellent communication skills in English (written and verbal)
- Prior enterprise support experience in a technical environment strongly preferred
Strong Hands-on Experience Working With Or Supporting The Following
- 8-12 years of experience with highly scalable, distributed, multi-node environments (50+ nodes)
- Hadoop operations, including Zookeeper, HDFS, YARN, Hive, and related components like the Hive metastore, Cloudera Manager/Ambari, etc.
- Authentication and security configuration and tuning (KNOX, LDAP, Kerberos, SSL/TLS, second priority: SSO/OAuth/OIDC, Ranger/Sentry)
- Java troubleshooting, e.g., collection and evaluation of jstacks, heap dumps
You might also have…
- Linux, NFS, Windows, including application installation, scripting, basic command line
- Docker and Kubernetes configuration and troubleshooting, including Helm charts, storage options, logging, and basic kubectl CLI
- Experience working with scripting languages (Bash, PowerShell, Python)
- Working knowledge of application, server, and network security management concepts
- Familiarity with virtual machine technologies
- Knowledge of databases like MySQL and PostgreSQL
- Certification on any of the leading Cloud providers (AWS, Azure, GCP ) and/or Kubernetes is a big plus
The right person in this role has an opportunity to make a huge impact at Acceldata and add value to our future decisions. If this position has piqued your interest and you have what we described - we invite you to apply! An adventure in data awaits.
Learn more at https://www.acceldata.io/about-us
Hiring for one of the MNCs for an India location
Key Responsibilities (Data Developer: Python, Spark)
Experience: 2 to 9 years
Development of data platforms, integration frameworks, processes, and code.
Develop and deliver APIs in Python or Scala for Business Intelligence applications built using a range of web languages
Develop comprehensive automated tests for features via end-to-end integration tests, performance tests, acceptance tests and unit tests.
Elaborate stories in a collaborative agile environment (SCRUM or Kanban)
Familiarity with cloud platforms like GCP, AWS or Azure.
Experience with large data volumes.
Familiarity with writing REST-based services.
Experience with distributed processing and systems
Experience with Hadoop / Spark toolsets
Experience with relational database management systems (RDBMS)
Experience with Data Flow development
Knowledge of Agile and associated development techniques
Senior Big Data Engineer
Note: Notice period: 45 days
Banyan Data Services (BDS) is a US-based, data-focused company that specializes in comprehensive data solutions and services, headquartered in San Jose, California, USA.
We are looking for a Senior Hadoop Big Data Engineer who has expertise in solving complex data problems across a big data platform. You will be a part of our development team based out of Bangalore. This team focuses on the most innovative and emerging data infrastructure software and services to support highly scalable and available infrastructure.
It's a once-in-a-lifetime opportunity to join our rocket ship startup run by a world-class executive team. We are looking for candidates that aspire to be a part of the cutting-edge solutions and services we offer that address next-gen data evolution challenges.
Key Qualifications
· 5+ years of experience working with Java and Spring technologies
· At least 3 years of programming experience working with Spark on big data, including experience with data profiling and building transformations
· Knowledge of microservices architecture is a plus
· Experience with any NoSQL databases such as HBase, MongoDB, or Cassandra
· Experience with Kafka or any streaming tools
· Knowledge of Scala would be preferable
· Experience with agile application development
· Exposure to cloud technologies, including containers and Kubernetes
· Demonstrated experience performing DevOps for platforms
· Strong skills in data structures & algorithms and in writing code with efficient complexity
· Exposure to Graph databases
· Passion for learning new technologies and the ability to do so quickly
· A Bachelor's degree in a computer-related field or equivalent professional experience is required
Key Responsibilities
· Scope and deliver solutions with the ability to design solutions independently based on high-level architecture
· Design and develop big data-focused microservices
· Be involved in big data infrastructure, distributed systems, data modeling, and query processing
· Build software with cutting-edge technologies in the cloud
· Be willing to learn new technologies and work on research-oriented projects
· Proven interpersonal skills; contribute to the team effort by accomplishing related results as needed
2018 Forbes Indonesia Choice Award winner and Galen Growth's 2018 Most Innovative HealthTech Startup in Asia. Ours is a secure health-tech platform with a mission to simplify access to healthcare by connecting millions of patients with licensed doctors, insurance, labs, and pharmacies in one mobile application.
Key Job Responsibilities:
- He/She is a responsive team player who can proactively contribute to building technical strategies for applications and systems by promoting an understanding of the technology and business roadmap.
- He/She is someone who thrives in a fun, fast-paced, dynamic, startup-like environment.
- Work very closely with various business stakeholders to drive the execution of multiple business plans and technologies.
- Work closely with Product, Design, and Marketing to conceive features, plan projects, and build roadmaps
- Prior experience with scalable architecture, managing a team of at least 5 engineers, and coaching and mentoring while maintaining a hands-on code-development role.
- Proven history of contributing to product strategy and shipping products with multi-functional teams.
- Highly involved in recruitment while building the team, also leading app development for both platforms
- Promote and support company policies, procedures, mission, values, and standards of ethics and integrity.
Minimum Qualification:
- 10+ years of total experience
- Hands-on work in Java (language understanding: Java 8, lambdas, collections, popular frameworks & libraries; JVM, GC tuning, performance tuning)
- Worked on REST frameworks/libraries like Spring MVC, Spring Boot, Dropwizard, RestExpress, etc.
- Worked on relational data stores, viz. MySQL, Oracle, or Postgres
- Worked on non-relational data stores, viz. Cassandra, HBase, Couchbase, MongoDB, etc.
- Worked on caching infra, viz. Redis, Memcached, Aerospike, Riak, etc.
- Worked on queueing infra, viz. Kafka, RabbitMQ, ActiveMQ, etc.
Regards,
Volks consulting
Axtechnosoft Private Limited
Principal Software Engineer /Architect
Job Description
Responsibilities:
- You would take ownership of the existing system and scale it more than 10X over the next 2 years.
- Apply best coding standards.
- You would create the infrastructure that can serve 100s of customers and millions of data requests per hour.
- Over the next year or so, you would be able to guide a team of 5 to 15 people to accomplish your goals. Mentoring this team into a world-class engineering team would be a key part of your role.
- You bring earlier experience in successfully building, deploying, and running complex, large-scale web or data products.
- You would work hand-in-hand with the Product Management team to build engineering capabilities that align with the evolution of the product.
- Eventually work with the Data Science team to ensure that the algorithmic intelligence we build is plugged into the product in the expected manner.
- Overall, you would be responsible for end-to-end architecting from an engineering standpoint.
Must have:
- Total experience of 8+ years, with relevant experience of at least 2 years.
- Have built a platform that handles at least 500k to 1 million data requests an hour.
- Worked on building an infrastructure that serves 200k+ customers.
- Hands on coder.
- Expert-level knowledge in at least one technology stack - Python or, ideally, Java. Also Angular, React, Node.js
- Expert-level knowledge of Elasticsearch or NoSQL technologies like MongoDB/HBase/Cassandra/Redis/Neo4j
- Experience developing web applications.
- Knowledge of multiple front-end languages and libraries (e.g. HTML/CSS, JavaScript, XML, jQuery)
- Working knowledge of databases (e.g. MySQL, MongoDB), web servers (e.g. Apache), and UI/UX design.
- DevOps experience working with AWS / other cloud platforms.
- Strong knowledge of APIs.
- Excellent communication and teamwork skills
- Implementing Software Engineering best practices.
- Previously worked on user facing products with scale.
- Agile methodology.
- Great attention to detail.
- Organizational skills
- An analytical mind
Good to have:
- Working knowledge of React Red.
- Open-source technology.
- Working knowledge of AI/ ML.
- Degree in Computer Science, Statistics or relevant field.
- Experience working in a start-up environment.
Key Skills
Python
Angular Javascript
Node.js
Elastic Search
NoSQL
Web Applications
Database
Web Servers
UX/UI Design
AWS Cloud
Agile Methodology
• Hadoop ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.)
• Should have good hands-on Spark experience (Spark with Java / PySpark)
• Hive
• Must be good with SQL (Spark SQL / HiveQL)
• Application design, software development, and automated testing
Environment Experience:
• Experience implementing integrated, automated release management using tools/technologies/frameworks like Maven, Git, code/security review tools, Jenkins, automated testing, and JUnit.
• Demonstrated experience with Agile or other rapid application development methods
• Cloud development (AWS/Azure/GCP)
• Unix / Shell scripting
• Web services , open API development, and REST concepts
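The "must be good with SQL" requirement above is testable with queries like grouped aggregations, since Spark SQL and HiveQL largely follow standard SQL conventions. A minimal sketch using stdlib `sqlite3` purely to show the query shape (the `events` table and its columns are hypothetical; in Spark the same statement would run via `spark.sql(...)`):

```python
import sqlite3

# In-memory database standing in for a Hive table / Spark DataFrame view.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("u1", 10.0), ("u1", 5.0), ("u2", 7.5)],
)

# This GROUP BY aggregation would read the same in Spark SQL / HiveQL.
rows = conn.execute(
    "SELECT user_id, SUM(amount) AS total "
    "FROM events GROUP BY user_id ORDER BY user_id"
).fetchall()
print(rows)  # [('u1', 15.0), ('u2', 7.5)]
```

What differs across the engines is not this query syntax but the execution model - a distributed plan over HDFS/object storage in Spark and Hive versus a single-process scan here.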
Hi,
We are looking for cloud solution professionals with the following skill sets:
Experience: 10+ years in cloud architecting
Location: Mumbai
Job Responsibilities:
- Analyze and understand customer business processes and workflows, define requirements, and design appropriate solutions.
- Provide end-to-end cloud solutioning along with secure infrastructure
- Collaborate with vendors on execution
- A good understanding of open-source stack frameworks and AWS & Azure cloud services
- Solutioning extending from greenfield to an enterprise view
- Presentation skills with a high degree of comfort with both large and small audiences.
- A high level of comfort communicating effectively across internal and external organizations
- Intermediate/advanced knowledge of cloud services, market segments, customer base, and industry verticals.
- Demonstrated experience leading or developing high quality, enterprise scale software products using a structured system development life cycle.
- Demonstrated ability to adapt to new technologies and learn quickly.
- Certified Solutions Architect (AWS/Azure)
- Recommendations on security, cost, performance, reliability and operational efficiency to accelerate challenging, mission-critical projects
- Experience migrating or transforming customer solutions to the cloud
Primary Skills :
Java/J2EE; Spring, Spring Boot, microservices, AngularJS, in-stream data handling, Elasticsearch, MongoDB; DevOps tools - Jenkins, GitHub, Maven builds; hands-on AWS & Azure cloud services; mobile: native and hybrid app hands-on; Docker containers, AKS, big data and HBase, Data Lake, Service Bus, AD
Secondary Skills :
- Extensive experience in microservices, REST services, JPA, and automated unit testing through tools.
- Proven design skills and expertise are required.
- Good knowledge of current/emerging technologies and trends.
- Good analytical, grasping, and problem-solving skills. Excellent written and verbal communication skills. High levels of initiative and creativity.
- Good communication skills with all stakeholders; a good team player with the ability to mentor juniors
We are looking for a Senior Python Developer to produce large scale distributed software solutions. You’ll be part of a cross-functional team that’s responsible for the complete software development life cycle, from conception to deployment.
If you’re also familiar with Agile methodologies, we’d like to meet you.
Responsibilities:
Work with development teams and product managers to ideate software solutions
Design client-side and server-side architecture
Build the front-end of applications through appealing visual design
Develop and manage well-functioning databases and applications
Write effective APIs
Test software to ensure responsiveness and efficiency
Troubleshoot, debug and upgrade software
Create security and data protection settings
Write technical documentation
Requirements
Proven experience as a Python Developer or similar role
Knowledge of Python, Django, MongoDB, Elasticsearch, AWS
Excellent communication and teamwork skills
Great attention to detail
Organizational skills
An analytical mind
Experience with Apache Kafka, HBase, and graph DBs is an added bonus
2. Perform data migration and conversion activities.
3. Develop and integrate software applications using suitable development methodologies and standards, applying standard architectural patterns, taking into account critical performance characteristics and security measures.
4. Collaborate with Business Analysts, Architects and Senior Developers to establish the physical application framework (e.g. libraries, modules, execution environments).
5. Perform end-to-end automation of the ETL process for various datasets that are being ingested into the big data platform.
Mid / Senior Big Data Engineer
Job Description:
Role: Big Data Engineer
Number of open positions: 5
Location: Pune
At Clairvoyant, we're building a thriving big data practice to help enterprises enable and accelerate the adoption of big data and cloud services. In the big data space, we lead and serve as innovators, troubleshooters, and enablers. The big data practice at Clairvoyant focuses on solving our customers' business problems by delivering products designed with best-in-class engineering practices and a commitment to keeping the total cost of ownership to a minimum.
Must Have:
- 4-10 years of experience in software development.
- At least 2 years of relevant work experience on large scale Data applications.
- Strong coding experience in Java is mandatory
- Good aptitude, strong problem-solving abilities and analytical skills, and the ability to take ownership as appropriate
- Should be able to do coding, debugging, performance tuning, and deploying the apps to prod.
- Should have good working experience with:
  o Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet)
  o Kafka
  o J2EE frameworks (Spring/Hibernate/REST)
  o Spark Streaming or any other streaming technology.
- Ability to work on the sprint stories to completion along with Unit test case coverage.
- Experience working in Agile Methodology
- Excellent communication and coordination skills
- Knowledgeable (hands-on preferred) in UNIX environments and different continuous-integration tools.
- Must be able to integrate quickly into the team and work independently towards team goals
- Take the complete responsibility of the sprint stories' execution
- Be accountable for the delivery of the tasks in the defined timelines with good quality.
- Follow the processes for project execution and delivery.
- Follow agile methodology
- Work with the team lead closely and contribute to the smooth delivery of the project.
- Understand/define the architecture and discuss its pros and cons with the team
- Involve in the brainstorming sessions and suggest improvements in the architecture/design.
- Work with other team leads to get the architecture/design reviewed.
- Work with the clients and counterparts (in the US) on the project.
- Keep all the stakeholders updated about the project/task status/risks/issues if there are any.
Experience: 4 to 9 years
Keywords: java, scala, spark, software development, hadoop, hive
Locations: Pune