Sr. Spark/AWS Developer
Posted by Molecular Connections
8 - 10 yrs
₹15L - ₹20L / yr
Bengaluru (Bangalore)
Skills
Spark
Hadoop
Big Data
Data engineering
PySpark
Apache Kafka
Scala
Python
Data modeling
  1. Big Data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components for ingestion, data modeling, querying, processing, storage, analysis and data integration, and in implementing enterprise-level Big Data systems.
  2. A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
  3. Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, Zookeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
  4. Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL & DataFrames, Kafka, Spark Streaming and Apache Storm.
  5. Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, including the NameNode, DataNode, ResourceManager, NodeManager and Job History Server.
  6. Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
  7. Well versed in the installation, configuration and management of Big Data and the underlying infrastructure of Hadoop clusters.
  8. Hands-on experience in coding MapReduce/YARN programs using Java, Scala and Python for analyzing Big Data.
  9. Exposure to Cloudera development environment and management using Cloudera Manager.
  10. Extensively worked with Spark using Scala on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
  11. Implemented Spark jobs in Python using the DataFrame and Spark SQL APIs for faster data processing; imported data from different sources into HDFS using Sqoop and performed transformations using Hive and MapReduce before loading the data into HDFS.
  12. Used the Spark DataFrame API on the Cloudera platform to perform analytics on Hive data.
  13. Hands-on experience with Spark MLlib, used for predictive intelligence, customer segmentation and smooth maintenance in Spark Streaming.
  14. Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
  15. Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
  16. Working on data pipelines for the ingestion and aggregation of different events and loading consumer response data into Hive external tables in HDFS to serve as a feed for Tableau dashboards.
  17. Hands-on experience in using Sqoop to import data into HDFS from RDBMS and vice versa.
  18. In-depth understanding of Oozie for scheduling all Hive/Sqoop/HBase jobs.
  19. Hands-on expertise in real-time analytics with Apache Spark.
  20. Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
  21. Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
  22. Experience with the Microsoft cloud and with setting up clusters on Amazon EC2 & S3, including automating the setup and scaling of clusters in the AWS cloud.
  23. Extensively worked with Spark using Python on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
  24. Strong experience in and knowledge of real-time data analytics using Spark Streaming, Kafka and Flume (see the sketch after this list).
  25. Knowledge of installing, configuring, supporting and managing Hadoop clusters using Apache and Cloudera (CDH3, CDH4) distributions and on Amazon Web Services (AWS).
  26. Experienced in writing ad hoc queries using Cloudera Impala, including Impala analytical functions.
  27. Experience in creating DataFrames using PySpark and performing operations on them using Python.
  28. In-depth understanding/knowledge of Hadoop architecture and various components such as HDFS, the MapReduce programming paradigm, High Availability and the YARN architecture.
  29. Established connections to multiple Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
  30. Generated various kinds of knowledge reports using Power BI based on business specifications.
  31. Developed interactive Tableau dashboards to provide a clear understanding of industry-specific KPIs, using quick filters and parameters to handle them more efficiently.
  32. Experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
  33. Experienced in designing, building, deploying and utilizing most of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance and auto-scaling.
  34. Good experience with use-case development and with software methodologies like Agile and Waterfall.
  35. Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
  36. Good working experience in importing data using Sqoop and SFTP from various sources such as RDBMS, Teradata, Mainframes, Oracle and Netezza into HDFS and performing transformations on it using Hive, Pig and Spark.
  37. Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
  38. Proficient in NoSQL databases including HBase, Cassandra and MongoDB, and their integration with the Hadoop cluster.
  39. Hands-on experience with Hadoop Big Data technology, working with MapReduce, Pig and Hive as analysis tools and Sqoop and Flume as data import/export tools.
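
As a concrete illustration of the real-time analytics described in items 4, 24 and 27 above, here is a minimal PySpark Structured Streaming sketch that reads JSON events from a Kafka topic and writes them as date-partitioned Parquet into HDFS, where a Hive external table could serve them to Tableau. The broker address, topic name, event schema and paths are illustrative assumptions, not details taken from this posting.

# Minimal sketch of a Kafka -> Spark Structured Streaming -> HDFS/Parquet pipeline.
# Requires the spark-sql-kafka-0-10 package on the Spark classpath.
# Broker, topic, schema and paths below are assumptions for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = (SparkSession.builder
         .appName("consumer-response-stream")   # hypothetical application name
         .getOrCreate())

# Assumed JSON payload carried in each Kafka message.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker1:9092")  # assumed broker
       .option("subscribe", "consumer_response")           # assumed topic
       .load())

# Parse the message value into typed columns and derive a partition date.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(F.from_json("json", event_schema).alias("e"))
          .select("e.*")
          .withColumn("event_date", F.to_date("event_ts")))

# Append micro-batches as date-partitioned Parquet under an HDFS location
# that a Hive external table (and a Tableau dashboard) could point at.
query = (events.writeStream
         .format("parquet")
         .option("path", "hdfs:///data/consumer_response")               # assumed path
         .option("checkpointLocation", "hdfs:///checkpoints/consumer_response")
         .partitionBy("event_date")
         .outputMode("append")
         .trigger(processingTime="1 minute")
         .start())

query.awaitTermination()

In practice the sink, checkpoint location and trigger interval would follow the cluster's own conventions; the same DataFrame logic could equally be run as a Sqoop-fed batch job against Hive tables.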

About Molecular Connections

Founded: 2001
Type: Products & Services
Size: 1000-5000
Stage: Profitable

About

With over two decades of experience in Big Data and Data Science solutions, Molecular Connections (MC) has been using AI-powered proprietary models to help customers achieve digital transformation. We have successfully built data-driven decision-making strategies for our customers' digital transformation journeys. MC leverages AI, ML and the Linked Data Store to build efficiencies in various verticals and generate new revenue streams for its customers. MC's decades of industry presence and a strong focus on innovation have led us to work with the world's leading pharma and STEM industries, offering end-to-end software development and data insights powered by proprietary workflows and platforms and enabling content engineering across multiple domains. With over 70% of its workforce being women, MC is ranked among the top 15 best companies for women to work for in India. Subsidiary companies: 1. Molecular Connections Analytics Pvt. Ltd. | URL: https://mcanalytics.co.in/ 2. Molecular Connections Research Pvt. Ltd. | URL: https://mcresearch.co.in/

Connect with the team

Molecular Connections
Chendil Kumar
Gurminder kaur
Lokanath Khamari


Similar jobs

Certcube Labs Pvt Ltd
Posted by Ridhi jain
Delhi
0 - 2 yrs
₹1L - ₹2L / yr
Corporate Communications
Marketing & Communication

Job Description

As an IT Sales Intern, you will be an integral part of our sales team, focusing on field sales activities. This internship is designed to provide you with hands-on experience in IT sales, client engagement, and sales strategy. You will have the opportunity to learn from experienced professionals and contribute to our growth in a dynamic industry.

Responsibilities

  • Identify potential clients through research, networking, and market analysis to create a pipeline of leads.
  • Develop and maintain relationships with prospective and existing clients by providing information on services.
  • Develop a deep understanding of our cybersecurity services to effectively communicate their value propositions to clients.
  • Prepare and deliver compelling sales presentations to showcase our IT solutions and address client needs.
  • Assist in preparing sales proposals, quotes, and contracts while collaborating with the sales team.
  • Perform various tasks, such as organizing sales documents, maintaining sales records, and scheduling appointments.
  • Analyze sales data to identify trends, opportunities, and areas for improvement. This may involve creating reports and presentations to share insights with the sales team.
  • Actively participate in training sessions and workshops to enhance your knowledge of IT services.

Candidate Skills

  • Strong communication and interpersonal skills.
  • Eagerness to learn and adapt in a fast-paced environment.
  • Ability to work independently and as part of a team.
  • Basic understanding of IT concepts and technology trends (preferred).
  • Previous sales or customer service experience (preferred).

Qualifications

  • Candidate must have a degree in business administration.
  • Any vendor-neutral business development training or certification will be preferred.

Benefits

  • Hands-on experience in the cybersecurity industry.
  • Training and mentorship from experienced professionals.
  • Opportunity to develop valuable sales and business development skills.


 

Salary Package: Performance based

Job Type: Full time (In office)

Start Date: Immediate

Documents Requirement: Previous Internship Experience Letter (if any), ID Proof, Internship job offer letter (if any), Updated CV, Last qualification certificate.


Shiprocket
Posted by sunil kumar
Gurugram
4 - 6 yrs
₹14L - ₹30L / yr
Data-flow analysis

Sr. Data Engineer

 

Company Profile:

 

Bigfoot Retail Solutions [Shiprocket] is a logistics platform which connects Indian eCommerce SMBs with logistics players to enable end-to-end solutions.

Our innovative, data-backed platform drives logistics efficiency, helps reduce cost, increases sales throughput by reducing RTO, and improves post-order customer engagement and experience.

Our vision is to power all logistics for the direct commerce market in India, including first mile, linehaul, last mile, warehousing, cross border and O2O.

 

Position: Sr. Data Engineer

Team: Business Intelligence

Location: New Delhi

 

Job Description:

We are looking for a savvy Data Engineer to join our growing team of analytics experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives.

 

Key Responsibilities:

  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centres and AWS regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.

 

 

Qualifications for Data Engineer

  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
  • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. (see the sketch after this list).
  • Experience with AWS cloud services: EC2, EMR, RDS, Redshift
  • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
  • Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
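
As a rough illustration of the workflow-management tooling listed above (referenced in the Airflow bullet), here is a minimal Airflow 2.x DAG sketch that chains a daily extract task with a load task. The DAG id, schedule and task bodies are placeholders invented for this example rather than part of the job description; a real pipeline would swap the print statements for Spark jobs, warehouse COPY commands or similar, and add data-quality checks.

# Minimal Airflow 2.x DAG sketch: a daily extract step followed by a load step.
# DAG id, schedule and callables are illustrative placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder: pull the previous day's records from a source system.
    print("extracting orders for", context["ds"])


def load_to_warehouse(**context):
    # Placeholder: copy the extracted data into the analytics warehouse.
    print("loading orders for", context["ds"])


with DAG(
    dag_id="orders_daily_etl",                  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                          # assumes the Airflow 2.4+ scheduling argument
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)

    extract >> load  # extraction must finish before loading starts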

 

 

 

 

 

Amazing Enterprises
Rajarajeshwari Nagar
0 - 1 yrs
₹1L - ₹2L / yr
Communication Skills

Roles and Responsibilities

 

· Supporting customers by providing helpful information.
· Overseeing the customer service process.
· Resolving customer complaints brought to your attention.
· Answering questions and responding to complaints.
· Maintaining a pleasant work environment for your team.
· Conducting quality assurance surveys with customers and providing feedback to the staff.
· Possessing excellent product knowledge to enhance customer support.
· Handling customer concerns and complaints in a timely manner.
· Informing customers of upcoming promotions, offers, updates and much more.
· Courier tracking: contacting courier personnel and giving feedback to the customer.
· Semi-voice process.

Beyondsoft
Posted by Muhammad Azhar
Hyderabad
5 - 7 yrs
₹5L - ₹15L / yr
Vue.js
AngularJS (1.x)
Angular (2+)
React.js
Javascript

 

Develop applications using new-stack technologies and deliver them effectively, efficiently, on time, to specification and in a cost-effective manner.

 

The full stack Java Developer should have 5-7 years of experience and be familiar with Java, Python, JavaScript, Spring Boot, continuous integration, branching and merging (Git), pair programming, code reviews, feature toggles, blue-green deployments, TDD and unit testing, agile methodologies (Scrum/XP), design patterns and REST APIs, with a good understanding of networking and security. Familiarity with RDBMS (preferably MariaDB or MySQL) and NoSQL is expected.

This Development Engineer role will play a hands-on role to develop quality applications within the desired timeframes.

 

  • Develop and build the product in an agile software development environment.
  • Participate in sprint planning, task breakdown, and daily stand-ups.
  • Work with product owners, the development lead and the architect to understand objectives and translate these into a system-level design and implementation.
  • Implement designs that meet quality and coding standards and provide a rich user experience across platforms.
  • Design and implement new frameworks and software that meet DBS's standards in performance, reliability, and maintainability.
  • Troubleshoot and solve production issues related to performance and reliability throughout the software stack.
  • Proactively identify bottlenecks in the system and work to resolve these issues before they become a problem.
  • Be proficient in communication and have a customer-focused mindset.
Leading Financial firm
Agency job
via Live Connections by Deepa Ganesh
Hyderabad
10 - 12 yrs
₹20L - ₹21L / yr
Genesys
SBC
SIP Trunking
Voice Over IP (VoIP)
Audio Codes
Professionals with 10-12 years of experience working on contact center solutions.
Hands-on experience in designing, building and troubleshooting Genesys contact center solutions.
Hands-on experience working with AudioCodes SBCs.
Experience with automation and scripting tools such as Python, Jenkins, Ansible and Perl.
Experience with SIP trunks.
Experience working with Cisco CUCM, Unity Connection and voice gateway routers.
Samsan Technologies
Posted by HR Varsha
Pune
3 - 7 yrs
₹1L - ₹10L / yr
NodeJS (Node.js)
React.js
Angular (2+)
AngularJS (1.x)
MongoDB

Job Responsibilities

  • Responsibilities for this position include, but are not limited to, the following.
  • 3-6 years of development experience.
  • Experience working with Azure cloud-hosted web applications and technologies.
  • Design and develop back-end microservices and REST APIs for connected devices, web applications, and mobile applications.
  • Stay up to date on relevant technologies, plug into user groups, and understand trends and opportunities that ensure we are using the best techniques and tools.

  • Meeting with the software development team to define the scope and scale of software projects.
  • Designing software system architecture.
  • Completing data structures and design patterns.
  • Designing and implementing scalable web services, applications, and APIs.
  • Developing and maintaining internal software tools.
  • Writing low-level and high-level code.
  • Troubleshooting and bug fixing.
  • Identifying bottlenecks and improving software efficiency.
  • Collaborating with the design team on developing micro-services.
  • Writing technical documents.
  • Be an active professional in continuous learning, contributing to the enhancement of organizational objectives.
  • Provide technical support to all internal teams and customers as it relates to the product.

Requirements:

  • Bachelor’s degree in computer engineering or computer science.
  • Previous experience as a full stack engineer and IoT Products.
  • Advanced knowledge of front-end languages including HTML5, CSS, JavaScript, Angular, React.
  • Proficient in back-end development with Node.js, with basic knowledge of Java and C#.
  • Experience with cloud computing APIs and Cloud Providers such as Azure or AWS.
  • Working knowledge of database systems (Cassandra, CosmosDB, Redis, PostgreSQL).
  • Messaging systems (RabbitMQ, MQTT, Kafka).
  • Cloud-based distributed application scaling & data processing in the cloud.
  • Agile/Scrum methodology.

  • Advanced troubleshooting skills.
  • Familiarity with JavaScript frameworks.
  • Good communication skills.

  • High-level project management skills.

TechUnity Software Systems India Pvt Ltd
Coimbatore
3 - 5 yrs
₹3L - ₹5L / yr
.NET
MVC Framework
ASP.NET
We are looking for an experienced Senior .NET developer to oversee the development of functional .NET applications and websites. You will be acting in a managerial role, overseeing the functions of the junior .NET development staff. You will be directly involved with .NET application coding, system debugging, code reviewing, and the development of operational procedures.

Responsibilities:

  • Meeting with technology managers to determine application and website requirements.
  • Upgrading existing .NET websites and applications.
  • Analyzing system requirements and delegating development tasks.
  • Developing technical specifications.
  • Writing scalable code for .NET software applications.
  • Reviewing and debugging .NET applications.
  • Providing support for junior developers.
  • Deploying functional websites, programs, and applications.
  • Drafting software and application operating procedures.
  • Training junior staff.

Requirements:

  • Bachelor’s degree in computer science or information technology.
  • Previous experience as a .NET developer.
  • High-level managerial skills.
  • Knowledge of .NET languages including C#, Visual Basic.NET, C++/CLI, J#, and JScript.NET.
  • Proficient with front-end development languages including JavaScript, HTML5, and CSS.
  • Ability to project manage.
  • Excellent problem-solving skills.
  • Good verbal and written communication skills.

 

University Living
Posted by Sanjoly Singhal
Noida, NCR (Delhi | Gurgaon | Noida)
0 - 2 yrs
₹1L - ₹3L / yr
Sales
Communication Skills
Client Servicing
English Proficiency
Inside Sales
  • Connecting with students internationally and assisting them with their queries.
  • Communication with global clients & accommodation providers.
  • Understanding key requirements of the students and providing end-to-end support.
  • Building rapport with clients through friendly, engaging communication.
  • Using various social media platforms for communication/networking.
  • Lead generation initiatives to convert leads into bookings.
  • Following up with students for any assistance required as part of the association.
YABX
Posted by Rajat Dayal
NCR (Delhi | Gurgaon | Noida)
9 - 15 yrs
₹30L - ₹60L / yr
Ruby
Ruby on Rails (ROR)
Spring
NodeJS (Node.js)
IBM WebSphere
Qualifications

  • BA/BS/MS degree in Computer Science, Electrical, Mathematics or a related technical field, or equivalent practical experience.
  • 8-12 years of overall experience working in the Banking & Financial Services domain, preferably in a product development company.
  • Experience/ability to design solution architecture for Core Banking implementations is highly preferred.
  • Should have the ability to articulate and design product extensibility, customizations and integrations in line with the product architecture.
  • Software development experience through hands-on coding in a general-purpose programming language.
  • Experience in developing and deploying solutions on the cloud.

Capabilities

  • Experience in designing and developing solutions for the financial services industry.
  • Experience in NoSQL/columnar databases: Clickhouse, MongoDB, Cassandra, ElasticSearch.
  • Experience in designing and developing products in a microservices framework using Node.js, Spring Boot, Ruby on Rails.
  • Web technologies: AngularJS/ReactJS.
  • Experience in XML/XSD/XPATH/XQUERY and JSON.
  • Exposure to working with Big Data platforms and knowledge of multiple Big Data frameworks.
  • Basic understanding of AI, ML and deep learning technologies.
  • Expert in designing and developing application integration using JMS/MQ and web services.
  • Knowledge and understanding of data integration solutions.
  • Experience in implementing reporting tools like ELK and exposure to data visualization.
  • Sound knowledge of J2EE application servers such as WebSphere, WebLogic, JBoss and Apache Tomcat.
  • Ability to shape ideas and opinions through proper communication and networking with key stakeholders, using one's knowledge and experience.
  • Excellent verbal, written and telephonic communication skills.
  • Ability to build scalable solutions.
  • Understanding of application security, including vulnerabilities and solutions.
  • Coding ninja: hands-on experience in the complete delivery of SaaS-based software.
  • Prior start-up experience preferred; self-starter.
  • Balances a hacker mentality with commercial-grade software.
  • Curious, with a passion for learning.
  • Goes above and beyond to solve a problem (technical or business).

Key Responsibilities

  • Actively participate in and contribute to architectural and technical stack choices.
  • Recommend and implement technical solutions in a phased approach to introduce new product constructs to our platform.
  • Create high-level and detailed design specifications.
  • Engage with Product Managers, Architects and business stakeholders to define platform requirements, testing, training and support.
  • Work with and support Product Owners to deliver world-beating solutions.
  • Work with business stakeholders and partners on system design, development and execution, and ensure high-quality deliverables.
  • Own the technical architecture, design and delivery of the solutions.
  • Provide mentorship to junior team members.
  • Hire junior engineers.
CarOK
Posted by Jaideep Patil
Nashik
2 - 5 yrs
₹3L - ₹5L / yr
Fulfillment
Logistics
Operations
Vendor Management
CarOK is a Pune-based startup with a mission to bring trust, transparency and efficiency to the automotive care sector. CarOK has tie-ups with a large number of car service providers in Pune. Using the CarOK website, mobile app or call centre, a customer can get competitive quotes for his/her car servicing and also book a servicing appointment at their convenience. Along with this, CarOK's trained mechanics & service advisors will personally supervise the servicing of the customer's car to ensure that the service centre does a good job.