
- Working knowledge of setting up and running HDInsight applications
- Hands-on experience with Spark, Scala, and Hive
- Hands-on experience with Azure Data Factory (ADF)
- Hands-on experience with Big Data and the Hadoop ecosystem
- Exposure to Azure service categories such as PaaS components and IaaS subscriptions
- Ability to design and develop ingestion and processing frameworks for ETL applications
- Hands-on experience with PowerShell scripting and deployment on Azure
- Experience in performance tuning and memory configuration
- Should be adaptable and willing to learn and work on new technologies
- Should have good written and spoken communication skills

Role: Course Information Representative
Location: Pune
About: Organization is a premier institution dedicated to nurturing the next generation of fitness professionals. We offer specialized diploma programs in Fitness Training and Nutrition, emphasizing practical learning and industry-relevant skills. Our state-of-the-art facilities, experienced faculty, and supportive environment provide students with the ideal platform to achieve their career goals in the fitness industry.
Work Culture: At Organization, we foster a collaborative, inclusive, and growth-oriented work culture. Our team is passionate about fitness and education, committed to making a positive impact on our students' lives. We value innovation, creativity, and a growth mindset, offering a supportive and fulfilling work environment.
Job description:
• Handling all fresh inquiries by making outbound phone calls, answering basic queries, and fixing appointments with Counsellors of Organization
• These would be all the leads who have already inquired about the courses at Organization through various media. The calls would NOT be cold calls to random data.
• Make follow-up calls to all these inquiries as required.
• Update the details on CRM and other systems used by the company
• Submit daily activity report to immediate superior
Desired Profile:
• Excellent communication, coordination, and counselling skills required
• Should be target oriented, focused, proactive, and a keen learner (one who reads or is constantly learning and upgrading his or her knowledge base)
Experience: 1-4 years (call center experience preferred)
Education: Graduate
If interested, apply via this link: https://tiny.cc/NGtalent
Design, implement, and improve the analytics platform
Implement and simplify self-service data query and analysis capabilities of the BI platform
Develop and improve the current BI architecture, emphasizing data security, data quality and timeliness, scalability, and extensibility
Deploy and use various big data technologies and run pilots to design low latency data architectures at scale
Collaborate with business analysts, data scientists, product managers, software development engineers, and other BI teams to develop, implement, and validate KPIs, statistical analyses, data profiling, prediction, forecasting, clustering, and machine learning algorithms
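As an illustration of the KPI work described above, here is a minimal sketch in Spark/Scala; the table, column names, and sample data are assumptions made for the example, not details from the posting:

```scala
// Minimal sketch of a KPI aggregation a BI/analytics platform might run on Spark.
// All names and data here are hypothetical and for illustration only.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DailyRevenueKpi {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-revenue-kpi")
      .master("local[*]")          // local mode for the sketch; a real deployment would run on a cluster
      .getOrCreate()

    import spark.implicits._

    // Stand-in for a fact table that would normally live in the warehouse or data lake.
    val orders = Seq(
      ("2024-01-01", "A", 120.0),
      ("2024-01-01", "B", 80.0),
      ("2024-01-02", "A", 50.0)
    ).toDF("order_date", "customer_id", "amount")

    // A simple KPI: daily revenue and distinct customers, the kind of aggregate
    // a self-service BI layer would surface.
    val kpi = orders
      .groupBy($"order_date")
      .agg(sum($"amount").as("revenue"), countDistinct($"customer_id").as("customers"))
      .orderBy($"order_date")

    kpi.show()
    spark.stop()
  }
}
```

In practice the input would come from the warehouse or data lake rather than an in-memory Seq, and the aggregate would be exposed through the self-service query layer.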
At Ganit we are building an elite team, so we are seeking candidates with the following educational and professional background:
7+ years of relevant experience
Expert-level skills in writing and optimizing complex SQL
Knowledge of data warehousing concepts
Experience in data mining, profiling, and analysis
Experience with complex data modelling, ETL design, and using large databases in a business environment
Proficiency with the Linux command line and systems administration
Experience with languages like Python/Java/Scala
Experience with Big Data technologies such as Hive/Spark
Proven ability to develop unconventional solutions; sees opportunities to innovate and leads the way
Good experience working with cloud platforms like AWS, GCP, and Azure, including projects involving the creation of a data lake or data warehouse
Excellent verbal and written communication
Proven interpersonal skills and the ability to convey key insights from complex analyses in summarized business terms; ability to communicate effectively with multiple teams
Good to have:
AWS/GCP/Azure Data Engineer Certification
About TIBCO
Headquartered in Palo Alto, CA, TIBCO Software enables businesses to reach new heights on their path to digital distinction and innovation. From systems to devices and people, we interconnect everything, capture data in real time wherever it is, and augment the intelligence of organizations through analytical insights. Thousands of customers around the globe rely on us to build compelling experiences, energize operations, and propel innovation. Our teams flourish on new ideas and welcome individuals who thrive in transforming challenges into opportunities. From designing and building amazing products to providing excellent service, we encourage and are shaped by bold thinkers, problem-solvers, and self-starters. We are always adapting and providing exciting opportunities for our employees to grow, learn, and excel.
We value the customers and employees that define who we are; dynamic individuals willing to take the risks necessary to make big ideas come to life and who are comfortable collaborating in our creative, optimistic environment. TIBCO – we are just scratching the surface.
Who You’ll Work With
TIBCO Data Virtualization (TDV) is an enterprise data virtualization solution that orchestrates access to multiple and varied data sources, delivering data sets and IT-curated data services to any analytics solution. TDV is a Java-based, enterprise-grade database engine supporting all phases of data virtualization development, run-time, and management. It is the trusted solution of choice for top enterprises in verticals like finance, energy, pharmaceutical, retail, telecom, etc. Are you interested in working on leading edge technologies? Are you fascinated with Big Data, Cloud, Federation, and Data Pipelines? If you have built software frameworks and have a background in Data Technologies, Application Servers, Business Intelligence, etc., this opportunity is for you.
Overview
The TIBCO Data Virtualization team is looking for an engineer with experience in SQL data access using JDBC, WebServices, and native client access for both relational and non-relational sources. You will have expertise in developing a metadata layer around disparate data sources and implementing a query runtime engine for data access, including plugin management. The core responsibilities will include designing, implementing, and maintaining the subsystem that abstracts data and metadata access across different relational database flavors, BigData sources, Cloud applications, and enterprise application packages like SAP R/3, SAP BW, Salesforce, etc. The server is implemented by a multi-million-line source base in Java, so the ability to understand and integrate with existing code is an absolute must. The core runtime is a complex multi-threaded system, and the successful candidate will demonstrate complete expertise in handling features geared towards concurrent transactions in a low latency, high throughput, and scalable server environment. The candidate will have the opportunity to work in a collaborative environment with leading database experts in building the most robust, scalable, and high performing database server.
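To make the data-source abstraction idea concrete, here is a minimal sketch in Scala; the trait and class names are hypothetical, invented for illustration, and are not TDV's actual plugin API. It only shows how a common interface over standard JDBC calls could let a metadata layer introspect a source:

```scala
import java.sql.{Connection, DriverManager}
import scala.collection.mutable.ListBuffer

// Hypothetical plugin contract: connect to a source and expose enough metadata
// for the server to build its virtual schema. Names are illustrative only.
trait DataSourcePlugin {
  def connect(url: String, user: String, password: String): Connection

  // Default JDBC-based introspection via DatabaseMetaData.getTables.
  def listTables(conn: Connection): Seq[String] = {
    val rs = conn.getMetaData.getTables(null, null, "%", Array("TABLE"))
    val tables = ListBuffer.empty[String]
    while (rs.next())
      tables += s"${rs.getString("TABLE_SCHEM")}.${rs.getString("TABLE_NAME")}"
    rs.close()
    tables.toList
  }
}

// Minimal relational implementation; any vendor JDBC driver on the classpath will do.
class JdbcPlugin extends DataSourcePlugin {
  override def connect(url: String, user: String, password: String): Connection =
    DriverManager.getConnection(url, user, password)
}
```

A non-relational or cloud source would supply its own implementation of the same contract, which is what keeps the query runtime independent of any individual driver.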
Job Responsibilities
In this crucial role as a Data Source Engineer, you will:
• Drive enhancements to existing data-source layer capabilities
• Understand and interface with 3rd party JDBC drivers
• Ensure all security-related aspects of driver operation function with zero defects
• Diagnose customer issues and perform bug fixes
• Suggest and implement performance optimizations
Required Skills
• Bachelor’s degree with 3+ years of experience, or equivalent work experience.
• 3+ years programming experience
• 2+ years of Java based server side experience
• 1+ years of experience with at least one of JDBC, ODBC, SOAP, REST, or OData
• 1+ years of multithreading experience
• Proficiency in both spoken and written communication in English is a must
Desired Skills
• Strong object-oriented design background
• Strong SQL & database background
• Experience developing or configuring cloud-based software
• Experience with all lifecycle aspects of enterprise software
• Experience working with large, pre-existing code bases
• Experience with enterprise security technologies
• Experience with any of the following types of data sources: Relational, Big Data, Cloud, Data Lakes, and Enterprise Applications.
• Experience using Hive, Hadoop, Impala, Cloudera, and other Big Data technologies
● The candidate will actively seek out new sales leads and business opportunities through active networking and by sending personal, strategic, value-add emails, calls, and social messages
● Use a combination of outreach mechanisms to nurture leads (calls, emails, marketing automation tools like Outreach, LinkedIn InMails, etc.)
● Learn, leverage, and help evolve our demand generation process.
● Generate appointments by means of proactive outbound prospecting.
● Work directly with sales and marketing to discover opportunities from leads.
● Demonstrate and teach strong selling and influencing skills
● Generate new business opportunities to fuel the sales pipeline for our products across our market segments
Skills And Qualification
● Candidate should be good at cold calling and writing cold emails
● Strong prospecting skills and ability to develop business in new accounts
● Familiar with sourcing prospect contact information using tools like ZoomInfo, Lusha, Sales Navigator, Apollo, Slintel, etc.
● Ability to think of creative ways of prospecting to make outbound more personalized
● Relationship-building with prospects
● Strong analytical skills to identify inefficiencies and improve them
● Self-starter who is able to operate in a hyper-growth environment


About us:
Arista Networks was founded to pioneer and deliver software driven cloud networking solutions for large datacenter storage and computing environments. Arista's award-winning platforms, ranging in Ethernet speeds from 10 to 400 gigabits per second, redefine scalability, agility and resilience. Arista has shipped more than 20 million cloud networking ports worldwide with CloudVision and EOS, an advanced network operating system. Committed to open standards, Arista is a founding member of the 25/50GbE consortium. Arista Networks products are available worldwide directly and through partners.
About this role:
- You will be working with the WiFi team at Arista, developing cutting edge and next generation WiFi solutions in a fast-paced environment. The WiFi team is responsible for the end to end development of the Cloud managed WiFi product portfolio of Arista. This specific position is for the WiFi AccessPoint team.
- As a core member of the AccessPoint team, you will be working closely with relevant teams to understand product requirements, design the solution, build the software and deliver it for final validation and customer deployment.
- You will also keep track of new and emerging technologies and their impact on Arista products, come up with new and innovative ideas to improve and differentiate the product and help Arista become a leading player in the Campus space.
- You will work closely with sales and support teams to push new solutions, understand customer needs and pain points and help resolve escalations.
- Your work will not be limited to a single aspect of the product; it will be broad, encompassing many different aspects including, but not limited to, developing new Access Points, designing and implementing new features, tracking new technologies, and working closely with the sales and customer teams.
Requirements:
• Strong engineering and Computer Science fundamentals
• Expected to have a strong background in software development and a good understanding of systems and networking, with knowledge of WiFi as an added bonus
• Minimum of 4 years of relevant experience
• Well versed in programming in C or C++
• Experience working in a Linux environment, developing applications or Linux drivers
• Proven experience in any of the below:
- Network device drivers, operating system internals, Kernels, compilers, SOC architecture
- Experience in developing Wi-Fi features (802.11), WLAN MAC protocol, system integration, and evaluating various performance parameters
- User space development for connectivity related products (Wireless Lan access points/ controllers, networking equipment) in one or more of following areas:
• HostAPD, Portal, RADIUS, AAA, Identity and role management, Radsec
• Tunnels, Firewall, Iptables, Flow Classification, QoS, TLS, DTLS
Preferred Skills
• Experience with Wi-Fi device drivers on Linux.
• Hands-on experience in working with one or more WIFI chipset platforms
• Good System Level understanding of the Wireless AP functionality
• Experience in developing Wi-Fi features, system integration, and evaluating various performance parameters
Resources:
- Arista Cognitive WiFi: https://www.arista.com/en/products/cognitive-wifi https://youtu.be/cT1INdR-xHQ https://www.youtube.com/watch?v=olPkCOT3MdA
- Arista Cognitive WiFi Datasheet: https://www.arista.com/assets/data/pdf/Datasheets/CloudVision-Wifi-Datasheet.pdf
- Arista's Approach to Software with Ken Duda (CTO): https://youtu.be/TU8yNh5JCyw
- Additional information and resources can be found at https://www.arista.com/en/


About the role: Looking for Software Developers who like to innovate and solve complex problems. We recognize that strength comes from diversity, and we will embrace your unique skills, curiosity, drive, and passion while giving you the opportunity to grow technically and as an individual.
Responsibilities:
- Work with an open-source iOS ecosystem and the libraries available for common tasks.
- Work directly with developers and product managers to conceptualize, build, test and realize products.
- Build reusable iOS software components for the platform.
- Unit-test code for robustness, including edge cases, usability, and general reliability.
- Work on bug fixing to improve application performance and ship new features as required
Requirements:
- Experience with iOS design patterns, memory management, REST web services, and JSON parsing
- Collaborating with cross-functional teams to define, design, and ship new features
- Design and build advanced applications for the iOS platform
- Work on bug fixing and improving application performance
- Ability to integrate apps with backend and third-party APIs.
- Good programming skills in Objective-C/Swift and extensive knowledge of Apple's SDKs and frameworks like Core Data, Core Graphics, Foundation, UIKit, etc.
- Have published one or more applications on the iOS App Store.
- Knowledge of the iOS App Store deployment process.


Srijan Technologies is hiring for Delivery Manager - Data Science/Data Engineering with a permanent WFH option.
Notice: Immediate joiners or candidates with 30 days of notice period are preferred.
Required Skills/Experience:
• Overall experience of 10+ years in the industry
• Good experience in Agile methodology
• Experience in delivering data engineering and data integration projects is a must
• Experience in delivering analytical programs is a must
• Experience in delivering AI/ML projects will be an added advantage
• Analytical skills
• Well-developed interpersonal skills
• Commercial awareness
• Effective communication
• Teamworking skills
• Ability to motivate people
• Management and leadership skills
Knowledge of Hadoop ecosystem installation, initial configuration, and performance tuning.
Expertise in Apache Ambari, Spark, Unix shell scripting, Kubernetes, and Docker.
Knowledge of Python is desirable.
Experience with HDP Manager/clients and various dashboards.
Understanding of Hadoop security (Kerberos, Ranger, and Knox), encryption, and data masking.
Experience with automation/configuration management using Chef, Ansible, or an equivalent.
Strong experience with any Linux distribution.
Basic understanding of network technologies, CPU, memory, and storage.
Database administration is a plus.
Qualifications and Education Requirements
2 to 4 years of experience with, and detailed knowledge of, core Hadoop components, solutions, and dashboards running on Big Data technologies such as Hadoop/Spark.
Bachelor's degree or equivalent in Computer Science, Information Technology, or related fields.
- Strong knowledge of JavaScript.
- Knowledge of Node.js and its frameworks.
- Knowledge of Node.js deployment on servers such as AWS and DigitalOcean.
- Good understanding of relational databases. Experience with MongoDB will be an advantage.
- Proficiency in handling server-side development, deployment, and debugging.
- Experience with socket programming is an added advantage.
- Good understanding of code versioning tools, such as Git.


What is the work?
- You will be part of the Technology Team, involved in all stages from architecture to development of our web-based application product.
- Designing and developing applications using Microsoft Technologies (ASP.NET MVC, C#, ASP.NET Core) with Angular or React JS
- Writing detailed programs through discussion with clients, clarifying what actions the program is intended to perform
What skills and experience are we looking for?
- Minimum 3 years of working experience in .NET-based applications.
- Must have knowledge of ASP.NET Core 2.0 or higher, Web API, and C#.
- Must have at least 1 year of current working experience in ASP.NET Core along with either Angular or React JS.
- A good understanding of modern front-end web development techniques including HTML5, CSS3, JavaScript, Knockout, Angular, React JS, and jQuery libraries.
- Experience in debugging and troubleshooting websites and web applications.
- Knowledge of Azure and the DevOps process, and experience with source control (SVN/TFS/Git/VSTS).
- Knowledge of Agile Development Methodologies like SCRUM.
- Willingness to learn and improve.
- Should possess strong problem-solving skills.
- Good Communication skills
- Ability to multitask and assist others by maintaining strong communication with colleagues and clients.

