- Big Data developer with 8+ years of professional IT experience, with expertise in Hadoop ecosystem components for data ingestion, data modeling, querying, processing, storage, analysis, and data integration, and in implementing enterprise-level Big Data systems.
- A skilled developer with strong problem-solving, debugging, and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala, and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL & DataFrames, Kafka, Spark Streaming, and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, including the NameNode, DataNode, ResourceManager, NodeManager, and Job History Server.
- Worked on both the Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration, and management of Big Data systems and the underlying infrastructure of a Hadoop cluster.
- Hands-on experience in coding MapReduce/YARN programs in Java, Scala, and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark with Scala on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark applications in Python, using the DataFrame and Spark SQL APIs for faster data processing; imported data from different data sources into HDFS using Sqoop and performed transformations using Hive and MapReduce before loading the data into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib for predictive intelligence and customer segmentation, and with maintaining MLlib models in Spark Streaming pipelines.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Created data pipelines for ingestion and aggregation events, loading consumer response data into Hive external tables at HDFS locations to serve as feeds for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience with the Microsoft cloud and with setting up clusters on Amazon EC2 & S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked on Spark with Python on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala, including Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on them using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established connections to multiple Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided the access needed to pull information for analysis.
- Generated various reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Experienced in projects using JIRA for tracking, and Maven and Jenkins as testing and build tools.
- Experienced in designing, building, deploying, and utilizing most of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance, and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience in importing data using Sqoop and SFTP from various sources such as RDBMS, Teradata, Mainframes, Oracle, and Netezza into HDFS, and performing transformations on it using Hive, Pig, and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases, including HBase, Cassandra, and MongoDB, and their integration with Hadoop clusters.
- Hands-on experience in Hadoop Big Data technology, working with MapReduce, Pig, and Hive as analysis tools and Sqoop and Flume as data import/export tools.
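As a sketch of the Sqoop-to-Hive scheduling pattern the bullets above describe, a minimal Oozie workflow might chain a Sqoop import action into a Hive transformation step. All names here (the workflow name, JDBC URL, table, directories, and script file) are hypothetical placeholders, not taken from the resume:

```xml
<!-- Minimal Oozie workflow sketch: Sqoop import followed by a Hive step.
     Workflow name, JDBC URL, table, paths, and script name are illustrative. -->
<workflow-app name="ingest-and-transform" xmlns="uri:oozie:workflow:0.5">
    <start to="sqoop-import"/>
    <!-- Pull a relational table into HDFS with Sqoop -->
    <action name="sqoop-import">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <command>import --connect jdbc:mysql://db-host/sales --table orders --target-dir /data/raw/orders -m 4</command>
        </sqoop>
        <ok to="hive-transform"/>
        <error to="fail"/>
    </action>
    <!-- Transform the imported data and load it into a Hive external table -->
    <action name="hive-transform">
        <hive xmlns="uri:oozie:hive-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <script>transform_orders.hql</script>
        </hive>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Workflow failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
    </kill>
    <end name="end"/>
</workflow-app>
```

A workflow like this would typically be deployed to HDFS alongside its `job.properties` and submitted with the `oozie job -run` CLI, or triggered on a schedule by an Oozie coordinator.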
Company Name- uFaber Edutech Pvt Ltd.
Shift Timing/Day- 10:30 AM-7:30 PM / 11 AM-8 PM
Location- Pune, Kolkata, Mumbai, Noida
Website: www.ufaber.com
Who are we-
uFaber is a well-funded Edutech startup, founded by serial entrepreneurs from IIT Bombay to change the way we learn. We sell high-quality online courses on a variety of topics, from exam preparation to certifications.
Role and Responsibilities-
- Dialing 50-80+ calls and counseling students who have enquired about the product/ services.
- Scheduling free demo lectures for the students.
- Doing post demo calls and closing sales.
- Maintaining a pipeline of all sales administration using CRM software.
- Working towards targets and under pressure, as this is a hardcore sales profile.
- Remaining flexible to work additional days and hours.
What we offer you-
Fixed Pay- 2.9 to 3LPA
Performance based incentives-2 to 3LPA
Total Pay- 4.9 to 6LPA
DSS Software Solutions Sdn. Bhd.
Outstanding career growth & development opportunities
Competitive salary & work benefit package
Passionate, energetic & innovative work culture
Experience building high-performance, large-scale, distributed server applications.
Experience with any of the back-end programming languages such as JavaScript (NodeJS), Java, C#, Python, PHP, etc.
- Commitment to developing clean code that is easy to maintain and enhance; strong logical thinking, analytical, and problem-solving skills.
Candidate must possess at least Diploma/Advanced/Higher/Graduate Diploma, Bachelor's Degree/Post-Graduate Diploma/Professional Degree in Computer Science/Information Technology or equivalent.
Required language(s): English, Mandarin
Preferably Junior Executive specialized in IT/Computer - Software or equivalent.
Preferably candidate with at least 2-year experience in IT/Computer - Software or equivalent.
Ineffable Communications is a new-generation full-range marketing agency for emerging technologies, ideas, and lifestyles. Passionate about building better futures, we tailor Branding, Creative, Advertising, and Marketing solutions to our clients, focusing on enhancing brand value through a range of marketing techniques. We have extensive experience in the field of marketing.
Note:
- This is an on-site opportunity, kindly don't apply if you are looking for a work-from-home position.
- Lucrative incentives to be provided.
Responsibilities
- Create visually appealing designs for print and digital marketing materials that align with brand standards and project goals
- Collaborate with cross-functional teams to understand project objectives, target audience, and messaging
- Manage multiple design projects simultaneously and deliver work on time
- Provide guidance and mentorship to junior designers to help them develop their skills and knowledge
- Stay up-to-date with industry trends, new design techniques, and software to continually improve skills
- Participate in brainstorming sessions to contribute to the creative direction of the company
Requirements
- 2-4 years of experience in graphic design with a strong portfolio showcasing design skills
- Bachelor's degree in graphic design or a related field
- Proficiency in Adobe Creative Suite and other design software
- Strong understanding of branding and design strategy
- Excellent communication skills, both written and verbal
- Ability to work in a fast-paced environment and deliver high-quality work on time
- Strong attention to detail and the ability to provide and receive constructive feedback
LogiCoy Software Technologies Pvt. Ltd - Bangalore Location
Years of Experience: Approximately 2 to 6 years.
Key Skills:
Java, Spring, Web services, XML, XSD, working knowledge of database and UNIX, Angular.
Good verbal and written communication.
Good team player, flexible and entrepreneurial.
Quick learner.
Candidates will get an opportunity to build skills in various technologies and domains.
Compensation would be better than industry standard.
With Kind Regards,
Vijayarani T
Assistant Manager - HR,
LogiCoy Software Technologies Pvt. Ltd, Bangalore, 560043.
In-depth knowledge of OData and SAP NetWeaver Gateway.
Excellent knowledge of RFC, BAPI, BAdI, BDC, User Exits, and enhancement points.
Strong exposure to system integration via EDI/ALE/IDoc.
Experience in legacy ERP migration and data migration processes (e.g., upload and validation; tools: LSMW, eCATT).
Experience in SAP ECC 6.0, ABAP programming, Data Dictionary, Adobe Forms, SMARTFORMS, SAPscript, Module Pool programming, ALV, Web Dynpro, and SAP Workflow.
Good understanding of FI, CO, MM, PP, SD, QM, IM, and WM objects and functional areas.
Comfortable working under the pressure of tight project deadlines.
Strong written and verbal communication with varying levels of audiences.
Ability to evaluate alternatives and make recommendations (options, pros/cons, and recommendations).
Performing production support activities.