- Working knowledge of setting up and running HDInsight applications
- Hands-on experience in Spark, Scala & Hive
- Hands-on experience in ADF (Azure Data Factory)
- Hands-on experience in Big Data & the Hadoop ecosystem
- Exposure to Azure service categories such as PaaS components and IaaS subscriptions
- Ability to design and develop ingestion & processing frameworks for ETL applications
- Hands-on experience in PowerShell scripting and deployment on Azure
- Experience in performance tuning and memory configuration
- Should be adaptable, willing to learn & work on new technologies
- Good written and spoken communication skills
What you will need:
Location: Bangalore (Work from Office)
Total No. of Positions: 10+
Experience Band 1: 5 to 8 years
Job Description:
Experience in a C++/Linux environment; details as below:
- Requirements analysis (DOORS, Jama, or similar tools preferred)
- Design methodologies (Object-Oriented, UML, Iterative; Agile preferred)
- C++ programming, C, shell scripting, networking protocols, audio & video distribution, and database management
- Software configuration management; experience with Git (Stash/Bitbucket) preferred
- System integration & test; problem solving and defect resolution
- Experience with object-oriented programming concepts, design patterns, and abstraction methodology
- Experience in software design and development of Linux-based systems following an industrial process
- Experience with testing frameworks (e.g. Google Test), mocking frameworks (e.g. Google Mock), static analysis tools (e.g. Cppcheck), and test-driven development
Good To Know:
- Object-oriented programming
- Database programming in any DB (preferably Oracle)
- Socket programming
- Multithreading (POSIX threads)
- Data structures, STL
- C++ unit testing
- Good debugging skills
Hi,
We are looking for cloud solution professionals with the following skill sets:
Experience: 10+ years in cloud architecting
Location: Mumbai
Job Responsibilities:
- Analyze and understand customer business processes and workflows, define requirements and design appropriate solutions.
- Provide end-to-end cloud solutioning along with secure infrastructure
- Collaborate with vendors on execution
- Good understanding of open-source stack frameworks and AWS & Azure cloud services
- Solutioning extending from greenfield projects to an enterprise-wide view
- Presentation skills with a high degree of comfort with both large and small audiences.
- High level of comfort communicating effectively across internal and external organizations
- Intermediate/advanced knowledge of the cloud services, market segments, customer base and industry verticals.
- Demonstrated experience leading or developing high quality, enterprise scale software products using a structured system development life cycle.
- Demonstrated ability to adapt to new technologies and learn quickly.
- Certified Solutions Architect (AWS / Azure)
- Recommendations on security, cost, performance, reliability and operational efficiency to accelerate challenging, mission-critical projects
- Experience migrating or transforming customer solutions to the cloud
Primary Skills:
Java / J2EE; Spring, Spring Boot, Microservices, AngularJS; in-stream data handling; Elasticsearch; MongoDB; DevOps tools: Jenkins, GitHub, Maven builds; hands-on AWS & Azure cloud services; Mobile: native and hybrid apps (hands-on); Docker containers, AKS; Big Data and HBase; Data Lake, Service Bus, AD
Secondary Skills:
- Extensive experience in Microservices, Rest Services, JPA, Automated unit testing through tools.
- Proven design skills and expertise is required.
- Good knowledge of current / emerging technologies and trends.
- Good analytical, grasping and problem solving skills. Excellent written and verbal communication skills. High levels of initiative and creativity.
- Good communication skills with all stakeholders; a good team player with the ability to mentor juniors
● Create and maintain optimal data pipeline architecture.
● Assemble large, complex data sets that meet functional / non-functional
business requirements.
● Build and optimize ‘big data’ data pipelines, architectures and data sets.
● Maintain, organize & automate data processes for various use cases.
● Identify trends, perform follow-up analysis, and prepare visualizations.
● Create daily, weekly and monthly reports of product KPIs.
● Create informative, actionable and repeatable reporting that highlights
relevant business trends and opportunities for improvement.
Required Skills And Experience:
● 2-5 years of work experience in data analytics- including analyzing large data sets.
● BTech in Mathematics/Computer Science
● Strong analytical, quantitative and data interpretation skills.
● Hands-on experience with Python, Apache Spark, Hadoop, NoSQL databases (MongoDB preferred), and Linux is a must.
● Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
● Experience with Google Cloud Data Analytics Products such as BigQuery, Dataflow, Dataproc etc. (or similar cloud-based platforms).
● Experience working within a Linux computing environment, and use of
command-line tools including knowledge of shell/Python scripting for
automating common tasks.
● Previous experience working at startups and/or in fast-paced environments.
● Previous experience as a data engineer or in a similar role.
Hello,
Greetings for the day !!!
Tridat Technologies is hiring an "L1 Windows Server Administrator" for an advanced technology solutions company catering to the needs of the Banking, Mobility, Payments and Government sectors.
Qualifications: Any graduate
Experience: 2+ yrs
Roles & Responsibilities:
• Windows/Linux OS administration (certification is an added advantage)
• Troubleshooting knowledge
• AD knowledge
• Cloud (AWS / Azure / GCP) knowledge (certification is an added advantage)
• Good communication skills
• Teamwork
• Familiarity with remote management
• 24x7 support
• Experience in virtualisation (VMware & Hyper-V)
• Familiarity with ticketing and ITSM processes
Location: Rabale, Navi Mumbai
Working Hours: 24x7 rotational shifts
Employment Mode: Contract to hire (Full time opportunity)
Joining Period: Immediate to max 15 days
Thank You & Regards,
Shraddha Kamble
HR Recruiter
Roles and Responsibilities:
- To maintain the required uptime for Azure Cloud and IT infrastructure.
- To provide the earliest possible resolution of reported issues, which may include but are not limited to cloud & end-user related issues.
- Configuring and managing alerts through Nagios, which may require scripting knowledge.
- Linux and Windows Server administration.
- Managing firewalls and domain controllers.
- Timely delivery of assigned tasks.
Requirements:
- 3-8 years of relevant experience
- Sound knowledge of Linux & Windows system administration
- Good hands-on experience with cloud (AWS/Azure); must be willing to work on Azure Cloud
- Good knowledge of networking, firewall & domain controller
- Basic knowledge of DevOps/Scripting
- Owning accountability and responsibility for end-to-end tasks
- We are looking for a Data Engineer to build the next-generation mobile applications for our world-class fintech product.
- The candidate will be responsible for expanding and optimising our data and data pipeline architecture, as well as optimising data flow and collection for cross-functional teams.
- The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimising data systems and building them from the ground up.
- Looking for a person with a strong ability to analyse and provide valuable insights to the product and business team to solve daily business problems.
- You should be able to work in a high-volume environment and have outstanding planning and organisational skills.
Qualifications for Data Engineer
- Working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
- Experience building and optimising ‘big data’ data pipelines, architectures, and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets. Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- Looking for a candidate with 2-3 years of experience in a Data Engineer role, who is a CS graduate or has an equivalent experience.
What we're looking for?
- Experience with big data tools: Hadoop, Spark, Kafka, and alternatives.
- Experience with relational SQL and NoSQL databases, including MySQL/Postgres and MongoDB.
- Experience with data pipeline and workflow management tools: Luigi, Airflow.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift.
- Experience with stream-processing systems: Storm, Spark Streaming.
- Experience with object-oriented/functional scripting languages: Python, Java, Scala.
As a Partner Development Solution Architect focused on GSI partners within Aqua Security, you will have the opportunity to deliver on a strategy to build mindshare and broad use of the Aqua Platform across the partner community. Your broad responsibilities will include owning the technical engagement with strategic partners, positioning Aqua to be part of partner offerings, and assisting with the creation of new technical strategies to help partners build and grow their application security practice business. You will be responsible for providing subject-matter expertise on the security of running cloud-native workloads, which are rapidly being adopted in enterprise deployments. You will also drive technical relationships with all stakeholders and support sales opportunities, and work closely with the internal sales and partner sales teams throughout the sales process to ensure all of the partners’ technical needs are understood and met with the best possible solution.
Responsibilities:
The ideal person will have excellent communication skills and be able to translate technical requirements for a non-technical audience. This person can multi-task, is self-motivated while still interacting well with a team, and is highly organized, with a high energy level and a can-do attitude. Required skills include:
- Experience as a sales engineer or solution architect, working with enterprise software products or services.
- Ability to assess partner and customer requirements, identify business problems, and demonstrate proposed solutions.
- Ability to present at technical meetups.
- Ability to work with partners and conduct technical workshops
- Recent familiarity or hands-on experience with:
- Linux distributions, Windows Server
- Networking configurations, routing, firewalling
- DevOps eco-system: CI/CD tools, datacenter automation, open source tools like Jenkins
- Cloud computing environments (AWS, Azure, and Google Compute)
- Container technologies like Docker, Kubernetes, OpenShift and Mesos
- Knowledge of general security practices & DevSecOps
- Up to 25% travel is expected. The ideal candidate will be located in Hyderabad, India
Requirements:
- 7+ years of hands on implementation or consulting experience
- 3+ years in a customer and or partner facing roles
- Experience working with end users or developer communities
- Experience working effectively across internal and external organizations
- Knowledge of the software development lifecycle
- Strong verbal and written communications
- BS degree or equivalent experience required
Good understanding of, or hands-on experience with, Kafka administration / Apache Kafka Streams.
Implementing, managing, and administering the overall Hadoop infrastructure.
Taking care of the day-to-day running of Hadoop clusters.
A Hadoop administrator has to work closely with the database team, network team, BI team, and application teams to make sure that all the big data applications are highly available and performing as expected.
If working with the open-source Apache distribution, Hadoop admins have to manually set up all the configuration files: core-site.xml, hdfs-site.xml, yarn-site.xml, and mapred-site.xml. However, when working with a popular Hadoop distribution such as Hortonworks, Cloudera, or MapR, the configuration files are set up on startup and the Hadoop admin need not configure them manually.
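As a rough sketch of the manual setup described above, a minimal core-site.xml for a single-node Apache Hadoop installation could be created as follows. The hostname and port here are illustrative assumptions, not values from this posting; on a real cluster these files live under $HADOOP_HOME/etc/hadoop/.

```shell
# Illustrative only: write a minimal core-site.xml for a single-node
# Apache Hadoop setup. fs.defaultFS points clients at the HDFS NameNode;
# hdfs://localhost:9000 is an assumed address, adjust for your cluster.
cat > core-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF
# Quick sanity check that the property landed in the file
grep -q 'fs.defaultFS' core-site.xml && echo "core-site.xml written"
```

The remaining files (hdfs-site.xml, yarn-site.xml, mapred-site.xml) follow the same `<configuration>`/`<property>` XML structure, each holding the properties for its respective subsystem.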
The Hadoop admin is responsible for capacity planning and for estimating the requirements for lowering or increasing the capacity of the Hadoop cluster.
The Hadoop admin is also responsible for deciding the size of the Hadoop cluster based on the data to be stored in HDFS.
Ensuring that the Hadoop cluster is up and running at all times.
Monitoring cluster connectivity and performance.
Managing and reviewing Hadoop log files.
Backup and recovery tasks.
Resource and security management.
Troubleshooting application errors and ensuring that they do not recur.