
Job Description:
1. Cloud experience (any cloud is fine, although AWS is preferred; if non-AWS, the experience should reflect familiarity with that cloud's common services)
2. Good grasp of scripting (Linux shells required, i.e. bash/sh/zsh; Windows scripting nice to have)
3. Basic knowledge of Python, Java, or JavaScript (Python preferred)
4. Monitoring tools
5. Alerting tools
6. Logging tools
7. CI/CD
8. Docker/containers (Kubernetes and Terraform nice to have)
9. Experience working on distributed applications with multiple services
10. Incident management
11. Database experience in terms of basic queries
12. Understanding of application performance analysis
13. An idea of data pipelines would be nice to have
14. Snowflake querying knowledge: nice to have
The person should be able to:
- Monitor system issues
- Create strategies to detect and address issues
- Implement automated systems to troubleshoot and resolve issues
- Write and review post-mortems
- Manage infrastructure for multiple product teams
- Collaborate with product engineering teams to ensure best practices are being followed
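The monitoring and automated-remediation duties above can be sketched as a minimal alerting rule (a plain-Python illustration using only the standard library; the threshold value and simulated probe results are made up for the example):

```python
from dataclasses import dataclass, field

# Minimal sketch of an alerting rule: fire only after `threshold`
# consecutive failed health checks, to avoid paging on transient blips.
@dataclass
class AlertRule:
    threshold: int = 3
    _consecutive_failures: int = field(default=0, init=False)

    def observe(self, healthy: bool) -> bool:
        """Feed one health-check result; return True if an alert should fire."""
        if healthy:
            self._consecutive_failures = 0
            return False
        self._consecutive_failures += 1
        return self._consecutive_failures >= self.threshold

rule = AlertRule(threshold=3)
results = [True, False, False, False, True]   # simulated probe outcomes
alerts = [rule.observe(r) for r in results]
print(alerts)  # [False, False, False, True, False]
```

In practice the probe results would come from a monitoring tool (e.g. scraping a health endpoint), and the alert would page through the alerting stack rather than print.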
- Big data developer with 8+ years of professional IT experience with expertise in Hadoop ecosystem components in ingestion, Data modeling, querying, processing, storage, analysis, Data Integration and Implementing enterprise level systems spanning Big Data.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components like Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL & DataFrames, Kafka, Spark Streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, which include NameNode, DataNode, ResourceManager, NodeManager and Job History Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in installation, Configuration, Managing of Big Data and underlying infrastructure of Hadoop Cluster.
- Hands on experience in coding MapReduce/Yarn Programs using Java, Scala and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on a cluster for computational analytics; installed it on top of Hadoop and performed advanced analytical applications by making use of Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster data processing; handled importing data from different data sources into HDFS using Sqoop, performed transformations using Hive and MapReduce, and then loaded the data into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands on experience in MLlib from Spark which are used for predictive intelligence, customer segmentation and for smooth maintenance in Spark streaming.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Working on creating data pipeline for different events of ingestion, aggregation, and load consumer response data into Hive external tables in HDFS location to serve as feed for tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience in Microsoft cloud and in setting up clusters on Amazon EC2 & S3, including automation of setting up and extending the clusters in the AWS Amazon cloud.
- Extensively worked on Spark using Python on cluster for computational (analytics), installed it on top of Hadoop performed advanced analytical application by making use of Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala, and also used Impala analytical functions.
- Experience in creating Data frames using PySpark and performing operation on the Data frames using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various kinds of knowledge reports using Power BI based on Business specification.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Well experienced in projects using JIRA, testing, Maven and Jenkins build tools.
- Experienced in designing, building, deploying and utilizing almost all of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance, and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience in importing data using Sqoop and SFTP from various sources like RDBMS, Teradata, Mainframes, Oracle and Netezza to HDFS, and performing transformations on it using Hive, Pig and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra and MongoDB, and their integration with Hadoop clusters.
- Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
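The "converting Hive/SQL queries into RDD transformations" work mentioned above follows a standard pattern: a GROUP BY aggregation becomes a map step followed by a reduce-by-key step. Below is a plain-Python analogue of that conversion (no Spark cluster is assumed; in PySpark the same logic would be `rdd.map(...).reduceByKey(...)`, and the sample rows are hypothetical):

```python
# SQL equivalent of what follows:
#   SELECT channel, SUM(clicks) FROM events GROUP BY channel
rows = [("web", 3), ("mobile", 5), ("web", 2), ("mobile", 1)]  # hypothetical (channel, clicks)

def reduce_by_key(pairs, reducer):
    """Combine values that share a key, mirroring Spark's reduceByKey."""
    acc = {}
    for key, value in pairs:
        acc[key] = reducer(acc[key], value) if key in acc else value
    return acc

mapped = [(channel, clicks) for channel, clicks in rows]   # the "map" step
totals = reduce_by_key(mapped, lambda a, b: a + b)         # the "reduceByKey" step
print(totals)  # {'web': 5, 'mobile': 6}
```

On a real cluster the reduce step runs per-partition first and then merges across partitions, which is why the reducer must be associative.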
We are looking for an SEO/SEM expert to manage all search engine optimization and marketing activities
You will be responsible for managing all SEO activities such as content strategy, link building and keyword strategy to increase rankings on all major search networks. You will also manage all SEM campaigns on Google, Yahoo and Bing in order to maximize ROI.
Responsibilities
- Execute tests, collect and analyse data and results, identify trends and insights in order to achieve maximum ROI in paid search campaigns
- Track, report, and analyse website analytics and PPC initiatives and campaigns
- Manage campaign expenses, staying on budget, estimating monthly costs and reconciling discrepancies.
- Optimize copy and landing pages for search engine marketing
- Perform ongoing keyword discovery, expansion and optimization
- Research and implement search engine optimization recommendations
- Research and analyse competitor advertising links
- Develop and implement link building strategy
- Work with the development team to ensure SEO best practices are properly implemented on newly developed code
- Work with editorial and marketing teams to drive SEO in content creation and content programming
- Recommend changes to website architecture, content, linking and other factors to improve SEO positions for target keywords.
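The campaign-analysis responsibilities above come down to tracking a few standard paid-search ratios. A minimal sketch (all figures are invented for the example):

```python
# Illustrative PPC campaign metrics used when optimising paid-search spend.
def campaign_metrics(impressions, clicks, cost, revenue):
    ctr = clicks / impressions              # click-through rate
    cpc = cost / clicks                     # cost per click
    roas = revenue / cost                   # return on ad spend
    return {"ctr": round(ctr, 4), "cpc": round(cpc, 2), "roas": round(roas, 2)}

m = campaign_metrics(impressions=10_000, clicks=250, cost=500.0, revenue=1_750.0)
print(m)  # {'ctr': 0.025, 'cpc': 2.0, 'roas': 3.5}
```

In practice these numbers would be pulled from the analytics and bid-management tools listed below rather than hard-coded.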
Requirements and skills
- Proven SEO experience
- Proven SEM experience managing PPC campaigns across Google, Yahoo and Bing.
- Solid understanding of performance marketing, conversion, and online customer acquisition
- In-depth experience with website analytics tools (e.g., Google Analytics, Net Insight, Omniture, Web Trends)
- Experience with bid management tools (e.g., Click Equations, Marin, Kenshoo, Search Ignite)
- Working knowledge of HTML, CSS, and JavaScript development and constraints
- Knowledge of ranking factors and search engine algorithms
- Up-to-date with the latest trends and best practices in SEO and SEM
What you must know:
Hands-on technical experience with architecting and building a large-scale product. He/she should have a flair for technology and business, be eager to take up challenging assignments in a global setup, and have relevant experience working with B2B startups in the growth stage targeting global markets based out of India.
Roles & responsibilities
• Lead a team of 10+ talented engineers (developers and QA) through all stages of product development and delivery (requirement gathering, requirement detailing, design, development, testing, release)
• Hire, mentor and develop engineers to create high-performing teams
• Identify, coach and retain engineering talent and strengthen software development teams
• Provide constructive feedback and mentor team members to help them reach the next level
• Build and maintain good relationships with peers, product management, architects, customer support, HR, the talent acquisition team and other cross-functional teams
• Work closely with engineering leaders and product managers for product delivery
• Contribute to engineering management team activities like hiring, onboarding, performance appraisal, skill management, release planning and delivery, etc.
• Provide technical leadership to software engineers to build a high quality software product
• Collaborate with internal/external stakeholders to enhance the quality of engineering deliverables
• Establish and promote a culture of excellence with end-to-end ownership for delivery
• Ensure complete solution design in collaboration with product management, architects
and technical leads
• Participate in the creation of engineering roadmap based on organization strategy
• Drive execution of quarterly releases and the roadmap for the next year.
• Analyze customer issues, and suggest and implement practices to address them and improve customer satisfaction with the product
• Do hands-on coding and contribute to code reviews
• Identify, develop and improve engineering practices for development, QA, DevOps and agile implementation.
• Build and monitor team performance metrics.
What will qualify you for this role?:
• 13+ years of experience, with 2+ years in a managerial capacity leading a team of 10+ developers
• Hands-on technical experience with architecting and building a large-scale product
• Experience with modern DevOps tools and technologies
• Proactive and solutions-oriented, with experience working in ambiguity
• Excellent coding and debugging in one of Java/Python/React
• Good understanding of distributed architecture, microservices, etc.
• Familiarity with cloud infrastructure. Knowledge of AWS would be an added advantage.
• Experience with setting up/tracking engineering metrics
JOB DESCRIPTION:
Role:
• Develop features for Olacabs’s iOS application for customers
• Work on bug fixing and improving application performance
• Actively participate in feature design
• Unit-test code for robustness, including edge cases, usability, and general reliability
• Take ownership of the features assigned, right from estimating timelines to production release
Desired experience
• Experience working on the iOS platform
• Experience in developing B2C mobile applications
• Expertise in development and implementation of mobile applications with custom UI components
• Experienced in memory management and in designing high-performance apps
• Experience working with Google Maps and Social APIs
• Experience working with Objective C, Cocoa, Core frameworks and the iPhone SDK (5.0 and above).
• Experience in shipping applications through App store
• Experience with analytics tools like Google Analytics, Flurry
• Well versed with mobile UI/UX conventions
• Experience in using Git
Desired Skills:
• Strong understanding of Object Oriented Programming, data structures and design patterns
• Strong in C/C++, Java programming skills
• Knowledge of software development processes & agile methodologies
• Strong problem solving and debugging skills
• Excellent English language (written & verbal) communication skills
• Good understanding of DB design
iOS SDE3: should be strong in Cocoa Touch, Swift, Objective-C and MVVM architecture.
Day/UK/US shifts available
J. P. Nagar 4th Phase
Speaking to UK/US customers and resolving queries.
Perks we offer:
✔️ Build cutting-edge web apps used by modern ecom brands.
✔️ Small and agile remote team.
✔️ Work directly with the leadership.
✔️ 5 days work week.
What we are looking for:
- 2+ years of work experience with both front-end and back-end development.
- Expertise in PHP/CI (or any other modern framework) and MySQL
- Strong proficiency with JavaScript, jQuery and HTML5, and writing cross-browser compatible code.
- Deep understanding of database, load optimization, API, caching layer, proxies, and other web services used in the system.
- Comfortable working with RESTful APIs, GraphQL experience a plus
- Knowledge of Shopify is a plus
- 2018 & 2019 batches of BE/B.Tech – CS/CE/IT/E&C/E&TC/Telecom/Communications/Electronics or MCA, and other circuit branches of relevance to corporations and industry qualifying degrees.
- Candidates must have graduated from the qualifying course in 2018 without any backlog.
- Candidates who graduated in 2019, meeting the criteria, are also eligible.
- Candidates should have scored a minimum of 60% in SSC/10th, HSC/12th/Diploma, and UG/PG as applicable.
- Not more than one gap year since SSC, meaning no loss of an academic year after joining a course.
- Flexible to relocate anywhere in India, and willing to work on any skill/domain/work timings.
