- Big Data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components, spanning ingestion, data modeling, querying, processing, storage, analysis, and data integration, and in implementing enterprise-level Big Data systems.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL & DataFrames, Kafka, Spark Streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, which include the NameNode, DataNode, ResourceManager, NodeManager and Job History Server.
- Worked on both the Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration, and management of Big Data tools and the underlying infrastructure of Hadoop clusters.
- Hands-on experience in coding MapReduce/YARN programs using Java, Scala and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark with Scala on a cluster for analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster processing of data; handled importing data from different data sources into HDFS using Sqoop, and performed transformations using Hive and MapReduce before loading the data into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence, customer segmentation, and smooth maintenance in Spark Streaming.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Created data pipelines for the ingestion, aggregation, and loading of consumer response data into Hive external tables in HDFS to serve as a feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience with the Microsoft cloud and with setting up clusters on Amazon EC2 & S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked on Spark with Python on a cluster for analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala, including Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on them using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various knowledge reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Experienced in projects using JIRA, testing frameworks, and the Maven and Jenkins build tools.
- Experienced in designing, building, deploying and utilizing almost all of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance, and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience in importing data using Sqoop and SFTP from various sources such as RDBMS, Teradata, Mainframes, Oracle and Netezza into HDFS, and performing transformations on it using Hive, Pig and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra and MongoDB, and their integration with Hadoop clusters.
- Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
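The profile above repeatedly describes converting Hive/SQL queries into programmatic transformations (Spark RDDs/DataFrames). As a minimal, dependency-free sketch of that idea, the snippet below runs the same aggregation once as SQL (via the standard-library sqlite3, standing in for Hive/Spark SQL) and once as an explicit map/reduce-style pipeline in plain Python. The table and column names are hypothetical, and this is an analogue of the Spark API, not Spark itself.

```python
import sqlite3
from collections import defaultdict

# Hypothetical event data (a stand-in for rows in a Hive table).
rows = [("click", 3), ("view", 5), ("click", 2), ("view", 1)]

# 1) The logic expressed as SQL, as it might run in Hive or Spark SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (event TEXT, cnt INTEGER)")
conn.executemany("INSERT INTO events VALUES (?, ?)", rows)
sql_result = dict(conn.execute(
    "SELECT event, SUM(cnt) FROM events GROUP BY event"))

# 2) The same logic as an explicit transformation pipeline,
#    analogous to a map -> reduceByKey chain on an RDD.
totals = defaultdict(int)
for event, cnt in rows:   # map: treat each row as a (key, value) pair
    totals[event] += cnt  # reduce: sum the values per key
pipeline_result = dict(totals)

assert sql_result == pipeline_result == {"click": 5, "view": 6}
```

In Spark the second form would be `rdd.map(lambda r: (r[0], r[1])).reduceByKey(add)`; the point is that both forms compute the same grouped aggregate.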
We are seeking a talented React Developer with hands-on experience in MUI and Tailwind CSS, as well as expertise in state management tools such as Redux and Redux-Saga. The ideal candidate should have a passion for front-end development, an eye for design, and be comfortable working in a fast-paced environment. You will be responsible for creating user interface components using React.js and React Native, integrating with RESTful APIs, and collaborating with cross-functional teams to build web applications.
Responsibilities:
Build new user-facing features using React.js, React Native and other front-end technologies
Develop reusable components and libraries for future use.
Collaborate with the development team to design and implement RESTful APIs.
Translate designs and wireframes into high-quality code.
Create responsive HTML designs based on wireframes from PSD, Figma, etc., that work across mobile and web browsers.
Develop and implement front-end architectures and design patterns.
Stay up-to-date with emerging trends and technologies in front-end development.
Write unit tests and integration tests to ensure code quality.
Ensure code follows best practices and coding standards.
Work collaboratively with UX designers and product owners to ensure a seamless user experience.
GUVI is looking for freelance Instructors!
(Godot) for e-learning courses (self-paced courses).
Company: GUVI, incubated by IIT Madras and IIM Ahmedabad.
Location: Remote
About the Company:
GUVI is an e-learning platform dedicated to providing high-quality technical education to students and professionals. We offer industry-relevant courses in a convenient and engaging format, with the support of prestigious institutions like IIT Madras and IIM Ahmedabad.
Job Description:
We are looking for experienced individuals in Godot to create and deliver top-notch e-learning courses. Your responsibilities will include developing course syllabi, designing instructional materials, and recording engaging teaching videos.
Requirements:
- Expertise in Godot.
- Strong command of the subject matter and fundamental concepts.
- Excellent communication skills and the ability to explain complex topics clearly.
- Familiarity with instructional design principles and creating engaging learning experiences.
- Self-motivated and able to work independently.
- Proficient in creating high-quality recorded video lessons.
- Familiarity with online teaching platforms and technologies.
Project Duration: 2 to 4 weeks (can be extended to 5 weeks)
Course: Godot (Beginner to Advanced)
Course duration: 12 to 13 hours
Per-video duration: 15 to 20 minutes
Compensation for the project: 6K to 8K (per finished hour)
Should have an interest in Business Development (B2C/B2B) for software application products/solutions.
Meet prospective clients and execute effective product demonstrations, emphasising product features and benefits with a focus on the value of the solution.
Meet with committee members of apartments and societies to understand the scope of business and their expectations.
Prospect, educate, qualify, and generate interest for Sales Opportunities.
On-boarding new apartments onto the NoBrokerHood platform will be a major KRA for a BD.
Research potential leads from the open market, web searches, or digital resources.
Hiring Software developers to build the next-gen Wealth Management Platform.
We’re looking for a Senior Platform Engineer who is passionate about the development and implementation of distributed cloud-based SaaS Platforms on B2C and B2B2C models.
You must be self-motivated and ready to achieve the goals assigned, working with everyone.
● Knowledge of the successful development and implementation of large cloud-based SaaS platforms on B2C and B2B2C models.
● Hands-on with Modern cloud platforms like AWS, GCP, Azure, etc.
● Strong aptitude for Serverless Technologies on the cloud such as Lambdas, Streaming based architectures.
● Solid understanding of open REST API-based architectures and web-based platform strategy, working with product, UI, and back-end engineering teams.
● Endorses test-driven development, best deployment practices, continuous integration and continuous build, and handling Git repositories.
● Hands-on with Python, Java, and multi-database platforms, both relational and NoSQL.
● Knack for microservices-based architectures.
● Must have an architectural sense of building highly scalable platforms on the cloud
● Experience with building asynchronous, message-driven platforms would be a plus.
● Interest and experience in Capital Markets is a plus.
DATAKYND seeks a Python Developer who can collaborate with our front-end developers and designers to meet multiple project needs for our clients. We seek a team player who can think beyond the code to provide recommendations and solutions focused on meeting clients' needs.
Responsibilities:
- Full-stack development
- Develop data migration, conversion, cleansing, retrieval tools and processes (ETL) using pandas
- Web Automation, Web crawlers and scrapers
- Data import/export formats for third-party applications (JSON/CSV)
- Integrations with third-party applications (REST API)
- Requirements analysis and providing solutions using Python and related tools.
- Supporting and maintaining existing Python scripts, applications and interfaces.
- Evaluating emerging open-source libraries and providing recommendations.
- Strong analytical and problem-solving skills are necessary
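The ETL responsibilities above (migration, conversion, cleansing) can be sketched as a small extract-transform-load pass. The posting names pandas for this work; the sketch below uses only the standard library so it stays self-contained, and all field names and the cleansing rules are hypothetical.

```python
import csv
import io
import json

# Hypothetical raw export: stray whitespace, a blank line, a bad amount.
raw_csv = """name,amount
 Alice ,10.5
Bob,not_a_number

Carol, 7
"""

def etl(csv_text: str) -> str:
    """Extract rows from CSV, cleanse them, and load them as JSON records."""
    records = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        name = (row.get("name") or "").strip()
        if not name:                       # drop rows with no name
            continue
        try:
            amount = float(row["amount"])  # coerce; reject bad values
        except (TypeError, ValueError):
            continue
        records.append({"name": name, "amount": amount})
    return json.dumps(records)

result = json.loads(etl(raw_csv))
assert result == [{"name": "Alice", "amount": 10.5},
                  {"name": "Carol", "amount": 7.0}]
```

The same extract/clean/emit shape carries over directly to pandas (`read_csv`, `dropna`, `to_numeric(errors="coerce")`, `to_json`) when datasets grow.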
Primary Technical and Functional Skills:
- Python 3.x , Web frameworks (Django, Flask)
- HTML, JavaScript (JS), CSS, Bootstrap
- PostgreSQL, MongoDB, SQL
- Multithreading, Logging, Email, Schedulers
- Third-party integrations, REST APIs and microservices
- JSON/CSV
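Two of the skills listed above, multithreading and logging, can be sketched together: a small worker pool draining a task queue, with thread-safe logging and a lock guarding shared state. All names here are illustrative, not from the posting.

```python
import logging
import queue
import threading

# The logging module is thread-safe, so workers can log concurrently.
logging.basicConfig(level=logging.INFO, format="%(threadName)s %(message)s")
log = logging.getLogger(__name__)

def worker(tasks: queue.Queue, results: list, lock: threading.Lock) -> None:
    """Drain the queue, squaring each task number."""
    while True:
        try:
            n = tasks.get_nowait()
        except queue.Empty:
            return
        log.info("processing %d", n)
        with lock:              # guard the shared results list
            results.append(n * n)
        tasks.task_done()

tasks = queue.Queue()
for n in range(5):
    tasks.put(n)

results, lock = [], threading.Lock()
threads = [threading.Thread(target=worker, args=(tasks, results, lock))
           for _ in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()

assert sorted(results) == [0, 1, 4, 9, 16]
```

For the "Schedulers" item, the same worker function could be driven periodically by `sched` from the standard library or a library such as APScheduler.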
Secondary Technical and Functional Skills:
- Basics of Linux, Nginx, Gunicorn, Apache, WSGI
- GitHub, Docker
- Cloud platforms such as GCP, Azure or AWS
- Design and workflow document preparation
Desired profile:
- Should have excellent verbal and written communication skills
Job Summary :
We are hiring a passionate Angular Front-End Developer to join our team at the earliest.
- You will be responsible for delivering a streamlined experience for our users.
- You will be a good fit if you like building top-notch codebase using the best practices of Angular.
Know your work :
- Create and deploy the front-end application.
- Optimize for high-performance on both mobile and web.
- A passionate front-end developer who can combine the latest open-source technologies to provide the best visualization reports using modeling tools.
- Write and deploy clean, documented code in Angular v2 and above, JavaScript, HTML5, and CSS3.
- Coordinate between graphic designers and HTML developers.
- Optimize application for maximum speed and scalability.
What you need to apply:
- Working experience with HTML5/CSS3/JS/Bootstrap/Angular v2 and above is a must.
- Minimum 2 years of work experience in Angular v2 and above.
- Strong knowledge of Angular 8 and Ivy Engine.
- Aware of any Object-Oriented JavaScript Frameworks (Backbone, Ember).
- Working experience of Node.js would be a plus.
- Working experience in any NoSQL database would be a plus.
- Experience in working out with RESTful APIs would be a plus.
- Aware of Firebase APIs.
- Previous experience with messaging queues (Redis/RabbitMQ/Kafka) would be a plus.
Good to have additional skills :
- Previous experience with Elasticsearch.
- An interest in ML, with experience in the same (though totally optional), is highly appreciated.
- Working knowledge of Docker/Ansible.
What you get :
- Amazing workplace and colleagues in the IT corridor of Bangalore.
- Competitive salary at par with the best in the industry.
- Immense exposure to new technologies.
Notice Period: Immediately or within two weeks
Location: Bengaluru/Bangalore, Pune
#angularfrontenddeveloper #SeniorAngularDeveloper #AngularIVYDeveloper #AngularIvy
Location: Cochin-Chittethukara
Exp: 3 to 7 yrs. Skills: .NET, C#, MVC Web API, WCF, Web services
- Can work independently on the Android Development platform
- Must have knowledge of both Java and Kotlin
- Good understanding of architectures such as MVVM and MVP.
- Must have at least 3 good-quality Android apps in your portfolio to showcase.