<h3>Job Profile</h3>
<ul>
<li>Hands-on experience working with Elasticsearch 5.x or 2.x</li>
<li>Hands-on experience programming in Python or Node.js</li>
<li>Ability to understand product requirements and map them onto relevant Elasticsearch features</li>
<li>In-depth understanding of analyzers, mappers, nested queries, aggregations, synonyms, significant terms, etc.</li>
<li>In-depth understanding of scoring (plus function score / custom scripting)</li>
<li>Experience in handling large indexes, sharding, and maintaining production-level clusters</li>
<li>Experience working with add-on tools such as Kibana, Logstash, Graph, and Machine Learning will be an added advantage</li>
</ul>
<h3>Required experience</h3> 2 to 7 years
<h3>Required qualification</h3> Strong foundation in computer science, with strong competencies in data structures, algorithms, and software design. Bachelor's or Master's degree in Computer Science or Engineering.
Description
Deep experience with and understanding of Apache Hadoop and surrounding technologies is required, including Spark, Impala, Hive, Flume, Parquet, and MapReduce.
Strong understanding of development languages including Java, Python, Scala, and shell scripting.
Expertise in Apache Spark 2.x framework principles and usage.
Should be proficient in developing Spark batch and streaming jobs in Python, Scala, or Java.
Should have proven experience in performance tuning of Spark applications, from both an application-code and a configuration perspective.
Should be proficient in Kafka and its integration with Spark.
Should be proficient in Spark SQL and data warehousing techniques using Hive.
Should be very proficient in Unix shell scripting and in operating on Linux.
Should have knowledge of cloud-based infrastructure.
Strong understanding of data profiling concepts and the ability to operationalize analyses into design and development activities.
Experience with software development best practices: version control systems, automated builds, etc.
Experienced in, and able to lead, all phases of the Software Development Life Cycle on any project (feasibility planning, analysis, development, integration, test, and implementation).
Capable of working within a team or as an individual.
Experience creating technical documentation.
Description
Does solving complex business problems and real-world challenges interest you? Do you enjoy seeing the impact your contributions make on a daily basis? Are you passionate about using data analytics to provide game-changing solutions to Global 2000 clients? Do you thrive in a dynamic work environment that constantly pushes you to be the best you can be, and more? Are you ready to work with smart colleagues who drive for excellence in everything they do? If you possess a solutions mindset, strong technological expertise, and a commitment to being part of a tremendous journey, come join our growing, global team. See what Saama can do for your career and for your journey.
Impact on the business: The candidate would play a key role in delivering success by leveraging Web and Big Data technologies and tools to fulfill clients' business objectives.
Responsibilities:
Participate in requirement-gathering sessions with business users and stakeholders to understand the business needs.
Understand functional and non-functional requirements and define the technical architecture and design to cater to them.
Produce a detailed technical design document to match the solution design specifications.
Review and validate effort estimates produced by the development team for the design and build phases.
Understand and apply the company's solutions and frameworks to the design when needed.
Collaborate with the development team to produce a technical specification for custom development and systems integration requirements.
Participate in, and lead when needed, project meetings with the customer.
Collaborate with senior architects in the customer organization and convincingly defend design and architecture decisions for the project.
Be a technical mentor to the development team.
Required Skills
Experience in designing scalable, complex distributed systems.
Hands-on development experience in the Big Data Hadoop ecosystem and analytics space.
Experience working with cloud storage solutions in AWS, Azure, etc.
MS/BS degree in Computer Science, Mathematics, Engineering, or a related field.
12 years of experience as a technology leader designing and developing data architecture solutions, with more than 2 years specializing in big data architecture or data analytics.
Experience implementing solutions using Big Data technologies: Hadoop, MapReduce, Pig, Hive, Spark, Storm, Impala, Oozie, Flume, ZooKeeper, Sqoop, etc.
Good understanding of NoSQL and prior experience working with NoSQL databases such as HBase, MongoDB, and Cassandra.
Competencies:
Self-starter who gets results with minimal support and direction in a fast-paced environment.
Takes initiative; challenges the status quo to drive change.
Learns quickly; takes smart risks to experiment and learn.
Works well with others; builds trust and maintains credibility.
Identifies and confirms key requirements in dynamic environments; anticipates tasks and contingencies.
Strong analytical skills; able to apply creative thinking to generate solutions for complex problems.
Communicates effectively; productive communication with clients and all key stakeholders (both verbal and written).
The hunt is for strong Java resources and team players with the ability to manage effective relationships with a wide range of stakeholders (customers and team members alike). The incumbent will demonstrate personal commitment and accountability to ensure standards are continuously sustained and improved, both within the internal teams and with partner organizations and suppliers.
Skills: Java
Experience: 7 to 9 years
Designation: Lead Engineer
Location: Pune (Hinjewadi Phase-2)
Position: Permanent
Notice Period: 2 months
Role Description:
Extensive Java (1.8+) and J2EE development experience.
Good knowledge of design patterns (creational, behavioural, and architectural).
In-depth knowledge and experience of working with Spring (Boot, Core, MVC, Security, Batch, Cloud), Hibernate, Maven, Gradle, etc.
Proficient in databases such as MySQL, Oracle, Postgres, and MongoDB.
Experience with JMS queues (ActiveMQ/RabbitMQ/Kafka).
Proficient in writing unit and integration test cases.
Should have working knowledge of Linux in order to deploy, monitor, and maintain an application.
Should have knowledge of source control and deployment tools such as Git, Jenkins, Bitbucket, etc.
Knowledge of microservices and full-stack architectures is an additional plus.
Should have knowledge of performance engineering and be able to carry out the required optimizations.
Ability to perform code reviews and ensure best practices.
Alongside this, the candidate should possess excellent communication skills and should be able to mentor a team when required.
Those who are interested can share their resume at email@example.com
We are a young and passionate team building our own product. You will enjoy working with us.