

- Big Data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components for ingestion, data modeling, querying, processing, storage, analysis and data integration, and in implementing enterprise-level Big Data systems.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL & DataFrames, Kafka, Spark Streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, which include the NameNode, DataNode, ResourceManager, NodeManager and Job History Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration and management of Big Data workloads and the underlying infrastructure of Hadoop clusters.
- Hands-on experience in coding MapReduce/YARN programs using Java, Scala and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked with Spark using Scala on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications combining Spark with Hive and SQL/Oracle.
- Implemented Spark applications in Python, using DataFrames and the Spark SQL API for faster data processing; imported data from different sources into HDFS using Sqoop and performed transformations using Hive and MapReduce before loading the data into HDFS.
- Used the Spark DataFrames API on the Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence, customer segmentation and smooth maintenance in Spark Streaming.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Working on creating data pipelines for ingestion and aggregation events, loading consumer response data into Hive external tables in HDFS to serve as a feed for Tableau dashboards (a sketch of such a pipeline follows this list).
- Hands-on experience in using Sqoop to import data into HDFS from RDBMS and vice versa.
- In-depth understanding of Oozie for scheduling Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience with the Microsoft cloud and with setting up clusters on Amazon EC2 & S3, including automating the provisioning and extension of clusters in the AWS cloud.
- Extensively worked with Spark using Python on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications combining Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge of installing, configuring, supporting and managing Hadoop clusters using Apache and Cloudera (CDH3, CDH4) distributions and on Amazon Web Services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala, including Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on them in Python.
- In-depth understanding of Hadoop architecture and its various components, such as HDFS, the MapReduce programming paradigm, High Availability and the YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various kinds of knowledge reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry-specific KPIs, using quick filters and parameters to handle them more efficiently.
- Experienced in projects using JIRA, as well as testing and build tools such as Maven and Jenkins.
- Experienced in designing, building, deploying and utilizing much of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience importing data using Sqoop and SFTP from sources such as RDBMS, Teradata, mainframes, Oracle and Netezza into HDFS, and performing transformations on it using Hive, Pig and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra and MongoDB, and their integration with Hadoop clusters.
- Hands-on experience with Hadoop Big Data technologies, working with MapReduce, Pig and Hive as analysis tools and Sqoop and Flume as data import/export tools.
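
To make the pipeline bullets above concrete, here is a minimal PySpark sketch of the kind of ingestion-and-aggregation job described: raw events already landed in HDFS are aggregated and written to the HDFS location backing a Hive external table that feeds Tableau dashboards. All paths, column and table names are hypothetical placeholders, not details from an actual project.

```python
# Minimal PySpark pipeline sketch; all paths and names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("consumer-response-pipeline")
    .enableHiveSupport()  # allows Spark to read/write Hive tables
    .getOrCreate()
)

# Raw events previously ingested into HDFS (e.g., via Sqoop or Flume).
raw = spark.read.json("hdfs:///data/raw/consumer_responses/")

# Aggregate responses per campaign and day.
daily = (
    raw.withColumn("event_date", F.to_date("event_ts"))
       .groupBy("campaign_id", "event_date")
       .agg(F.count("*").alias("responses"),
            F.countDistinct("customer_id").alias("unique_customers"))
)

# Write to the HDFS location backing a Hive external table,
# which downstream Tableau dashboards use as their feed.
(daily.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("hdfs:///warehouse/external/consumer_response_daily/"))

spark.stop()
```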



We are seeking a Junior Software Engineer (AWS, Azure, Google Cloud, Spring, Node.js, Django) to join our dynamic team. As a Junior Software Engineer, you will have a passion for technology, a solid understanding of software development principles, and a desire to learn and grow in a collaborative environment. You will work closely with senior engineers to develop, test, and maintain software solutions that meet the needs of our clients and internal stakeholders.
Responsibilities:
- Software Development: Write clean, efficient, and well-documented code for various software applications.
- Testing & Debugging: Assist in testing and debugging software to ensure functionality, performance, and security.
- Learning & Development: Continuously improve your technical skills by learning new programming languages, tools, and AI methodologies.
- Documentation: Assist in the documentation of software designs, technical specifications, and user manuals.
- Problem-Solving: Identify and troubleshoot software defects and performance issues.
- Customer Communication: Interact with customers to gather requirements, provide technical support, and ensure their needs are met throughout the software development lifecycle. Maintain a professional and customer-focused attitude in all communications.
Requirements:
- Education: Bachelor's degree in Computer Science, Software Engineering, or a related field.
- Programming Languages: Proficiency in at least one programming language such as Java, Python, TypeScript or JavaScript.
- Familiarity with: the Git version control system, the Scrum software development methodology, and a basic understanding of databases and SQL.
- Problem-Solving Skills: Strong analytical and problem-solving skills with a keen attention to detail.
- Communication: Good verbal and written communication skills with the ability to work effectively in a team environment and interact with customers.
- Adaptability: Ability to learn new technologies and adapt to changing project requirements.
- Internship/Project Experience: Previous internship experience or project work related to software development is a plus.
Preferred Skills:
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud).
- Familiarity with back-end frameworks (e.g., Spring, Node.js, Django); a minimal Django sketch follows this list.
- Knowledge of DevOps practices and tools.
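
As a small illustration of the back-end familiarity mentioned above, here is a minimal Django sketch (one of the frameworks this posting names). It is only a urls.py fragment with a single JSON endpoint; the scaffolding that `django-admin startproject` generates is assumed, and all names are illustrative.

```python
# Minimal Django sketch: a single JSON endpoint wired into urlpatterns.
# Assumes standard Django project scaffolding; names are illustrative.
from django.http import JsonResponse
from django.urls import path

def health(request):
    """Trivial endpoint returning service status as JSON."""
    return JsonResponse({"status": "ok"})

urlpatterns = [
    path("health/", health),
]
```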
Benefits:
- Work Location: Remote
- 5-day work week
You can apply directly through the link: https://zrec.in/F57mD?source=CareerSite
Explore our Career Page for more such jobs: careers.infraveo.com
About the company: RealmApp is a browser extension for literacy, discovery, and productivity for Web3 users. Users can choose from a range of customisable widgets suiting their stage and role in crypto.
The team comprises experienced crypto entrepreneurs, operators, and stalwarts of the industry as advisors and investors.
Min Requirements:
- Figma basics
- Passionate about UI/UX
- Understanding of fundamental design concepts such as color theory, typography, and composition
Good to have:
- A general understanding of web and mobile technologies and development languages
- Fundamental knowledge of user experience research and testing methods
Application Process:
Interested candidates can apply by sharing assessments of the company website (www.realmapp.io) along with their portfolio link.


Remitbee Online Money Transfer is seeking a skilled QA Automation Engineer with experience in Selenium and Manual Testing. Individuals who apply for Remitbee careers should be passionate about tech and driven towards innovating the industry further with the Remitbee team. This position also comes with the opportunity for career growth and working hour flexibility. We look forward to reading your application.
What will you do?
- Work in an agile team of developers, QA, DevOps and founders
- Ongoing Web, API and UI testing
- Automation testing with Selenium
- Provide accurate estimates
Skills and requirements
- 5+ years of experience using Selenium or a similar automation tool (a minimal Selenium sketch follows this list)
- Able to write test cases based on business requirements
- Write automation and perform Web, API, UI, functional, regression, smoke, black box, load, performance and end-to-end testing
- Experience using Jira, Xray, TestRail or similar test management tools.
- Familiarity with network packet analysis tools, such as Wireshark.
- Problem solving skills including the ability to troubleshoot and identify problem areas throughout testing.
- Collaborate well with others; ability to translate technical concepts into easy-to-understand terms.
- Strong verbal and written communication skills.
- Programming experience with Java.
- Proven experience leading a team.
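
To ground the Selenium requirement above, here is a minimal automated UI test sketched with Selenium's Python bindings (the posting mentions Java experience; the same flow applies across language bindings). The URL, locators and credentials are hypothetical placeholders.

```python
# Minimal Selenium sketch of an automated login check; all locators,
# URLs and credentials are hypothetical placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()  # assumes a local Chrome/chromedriver setup
try:
    driver.get("https://example.com/login")

    # Fill in and submit the login form.
    driver.find_element(By.ID, "email").send_keys("qa-user@example.com")
    driver.find_element(By.ID, "password").send_keys("not-a-real-password")
    driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()

    # Wait for the dashboard heading instead of sleeping a fixed time.
    heading = WebDriverWait(driver, 10).until(
        EC.visibility_of_element_located((By.TAG_NAME, "h1"))
    )
    assert "Dashboard" in heading.text, "login did not reach the dashboard"
finally:
    driver.quit()
```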
Competencies
- Strong communication skills
- Self-Motivated
- Willingness to learn new tools and technology and work in a collaborative environment
- Creativity: this is an opportunity to be involved in shaping the strategy of our company. Ideas and input from all levels of the business are welcome.
Type: Full time
Total Exp: 5+ years
Data Analyst
Job Description
Summary
Are you passionate about handling large & complex data problems, want to make an impact and have the desire to work on ground-breaking big data technologies? Then we are looking for you.
At Amagi, great ideas have a way of becoming great products, services, and customer experiences very quickly. Bring passion and dedication to your job and there's no telling what you could accomplish. Would you like to work in a fast-paced environment where your technical abilities will be challenged on a day-to-day basis? If so, Amagi’s Data Engineering and Business Intelligence team is looking for passionate, detail-oriented, technically savvy, energetic team members who like to think outside the box.
Amagi’s Data warehouse team deals with petabytes of data catering to a wide variety of real-time, near real-time and batch analytical solutions. These solutions are an integral part of business functions such as Sales/Revenue, Operations, Finance, Marketing and Engineering, enabling critical business decisions. Designing, developing, scaling and running these big data technologies using native technologies of AWS and GCP are a core part of our daily job.
Key Qualifications
- Experience in building highly cost-optimised data analytics solutions
- Experience in designing and building dimensional data models to improve the accessibility, efficiency and quality of data
- Hands-on experience in building high-quality ETL applications, data pipelines and analytics solutions, ensuring data privacy and regulatory compliance.
- Experience in working with AWS or GCP
- Experience with relational and NoSQL databases
- Experience in full-stack web development (preferably Python)
- Expertise with data visualisation systems such as Tableau and QuickSight
- Proficiency in writing advanced SQL queries, with expertise in performance tuning when handling large data volumes
- Familiarity with ML/AI technologies is a plus
- Demonstrated strong understanding of development processes and agile methodologies
- Strong analytical and communication skills; self-driven, highly motivated and able to learn quickly
Description
Data Analytics is at the core of our work, and you will have the opportunity to:
- Design data-warehousing solutions on Amazon S3 with Athena, Redshift, GCP Bigtable, etc.
- Lead quick prototypes by integrating data from multiple sources
- Do advanced business analytics through ad-hoc SQL queries (a minimal Athena sketch follows this list)
- Work on Sales and Finance reporting solutions using Tableau, HTML5 and React applications
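
As one concrete reading of the "advanced business analytics through ad-hoc SQL queries" responsibility, here is a minimal sketch that runs an Athena query over data in S3 using boto3 and polls for the result. The database, table, bucket and region names are hypothetical placeholders.

```python
# Minimal boto3/Athena sketch for an ad-hoc SQL query over S3 data.
# Database, table, bucket and region names are hypothetical placeholders.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

QUERY = """
SELECT region, SUM(revenue) AS total_revenue
FROM sales_events
WHERE event_date >= DATE '2024-01-01'
GROUP BY region
ORDER BY total_revenue DESC
"""

# Start the query; Athena writes its results to the given S3 location.
execution = athena.start_query_execution(
    QueryString=QUERY,
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/adhoc/"},
)
query_id = execution["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:  # the first row is the column header
        print([col.get("VarCharValue") for col in row["Data"]])
```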
We build amazing experiences and create depth in knowledge for our internal teams and our leadership. Our team is a friendly bunch of people that help each other grow and have a passion for technology, R&D, modern tools and data science.
Our work relies on a deep understanding of the company's needs and an ability to go through vast amounts of internal data such as sales, KPIs, forecasts, inventory, etc. One of the key expectations of this role is to do data analytics, build data lakes, and deliver end-to-end reporting solutions. If you have a passion for cost-optimised analytics and data engineering and are eager to learn advanced data analytics at a large scale, this might just be the job for you.
Education & Experience
A bachelor’s/master’s degree in Computer Science with 5 to 7 years of experience; previous experience in data engineering is a plus.


Responsibilities:
- Involvement in the full software development life cycle within broadly defined parameters, providing software solutions with software quality needs in mind
- Design and define the interaction between the different component pieces
- Write efficient code based on the brief given by the team lead
- Fast prototyping of proof-of-concept features/applications based on a brief
- Develop and maintain new features on the Java stack
- Own the delivery of an entire piece of a system or application
- Manage and execute against project plans and delivery commitments
- Work closely with peers and leads to develop the best technical design and approach for new product development
- Build software solutions for complex problems
- Comply with the build/release and configuration management process
- Develop unit test cases for his/her project module
- Execute appropriate quality plans, project plans, test strategies and processes for development activities, in concert with business and project management efforts
Desired Profile:
- Good understanding of Object-Oriented Programming concepts; hands-on knowledge of the Java stack (Spring/Hibernate)
- Development across multiple browsers/platforms on websites
- Good understanding of SQL/NoSQL data stores
- Fair understanding of responsive high-level designs
- Work experience in a product/start-up company is a plus
- Familiarity with MVC, SOA and RESTful web services
- Ability to work with other teams and manage time across multiple projects and tasks in a deadline-driven, team environment
- Good to have: knowledge of JavaScript (AngularJS/ReactJS)/HTML/CSS/jQuery front-end code across a broad array of interactive web projects
- Understanding of agile methodology and the ability to instill best practices into the process
DevOps Engineer:
Roles and Responsibilities
- 3+ years of experience in Infrastructure setup on Cloud, Build/Release Engineering, Continuous Integration and Delivery, Configuration/Change Management.
- Good experience with Linux/Unix administration and moderate to significant experience administering relational databases such as PostgreSQL, etc.
- Experience with Docker and related tools (Cassandra, Rancher, Kubernetes, etc.)
- Experience working with configuration management tools (Ansible, Chef, Puppet, Terraform, etc.) is a plus.
- Experience with cloud technologies like Azure
- Experience with monitoring and alerting (TICK, ELK, Nagios, PagerDuty)
- Experience with distributed systems and related technologies (NSQ, RabbitMQ, SQS, etc.) is a plus
- Experience with scaling data store technologies (PostgreSQL, Scylla, Redis) is a plus
- Experience with SSH Certificate Authorities and Identity Management (Netflix BLESS) is a plus
- Experience with multi-domain SSL certs and provisioning (Let's Encrypt) is a plus
- Experience with chaos engineering or similar methodologies is a plus
Experience : 3-7 Years
Job Location : Pune/Remote (only India)
Notice Period : ASAP
Work Timings : 2.30 pm-11:30 pm IST

