Double Click for Publisher (DFP) Job Openings in Pune
at YuktaMedia
Position: Call Quality Analyst
Key Responsibilities:
1. Conduct call audits: Listen to recorded or live customer service calls to evaluate the quality of interactions, ensuring adherence to company standards.
2. Quality assessment and control: Evaluate call agents' performance based on established metrics, including communication skills, product knowledge, adherence to scripts, and compliance with company policies.
3. Provide feedback: Deliver constructive and actionable feedback to call agents based on audit results to help them improve their performance and enhance customer satisfaction.
4. Reporting: Prepare detailed reports summarizing audit findings, trends, and areas for improvement. Provide regular reports to the management team.
5. Process improvement: Work with cross-functional teams to identify process improvements that can enhance the overall quality of customer interactions.
6. Compliance: Ensure that all customer service activities comply with legal and regulatory requirements.
7. Customer feedback: Incorporate customer feedback into quality assessment processes.
Qualifications:
• Bachelor's degree in a relevant field or equivalent work experience.
• Proven experience in quality analysis and call auditing, preferably in a customer service environment.
• Excellent communication skills, both written and verbal.
• Knowledge of relevant industry regulations and compliance standards.
• Proficiency with Microsoft Office suite (Word, Excel, PowerPoint).
Call QA for Pune
- Language: Gujarati, Tamil, Telugu, Malayalam, Punjabi, or Kannada (any one)
- Experience: 6 months to 2 years (BPO voice experience also considered)
- Salary: 25k CTC
- Location: Viman Nagar
Call QA for Bangalore/Jaipur
- Language: Hindi and English (both)
- Experience: 6 months to 2 years (BPO voice experience also considered)
- Salary: 22k CTC
- Location: HSR Layout
- Developing new user-facing features using React.js
- Building reusable components and front-end libraries for future use (see the sketch after this list)
- Translating designs and wireframes into high-quality code
- Optimizing components for maximum performance across a vast array of web-capable devices and browsers
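To make the "reusable components" item concrete, here is a minimal sketch of a reusable React component in TypeScript; the component name, props, and class names are illustrative assumptions, not part of the job description.

```tsx
import React from "react";

// Hypothetical example: a small, reusable button component.
// The name, props, and variants are illustrative assumptions.
type ButtonProps = {
  label: string;
  variant?: "primary" | "secondary"; // visual style, defaults to "primary"
  onClick?: () => void;
};

export function AppButton({ label, variant = "primary", onClick }: ButtonProps) {
  return (
    <button className={`btn btn-${variant}`} onClick={onClick}>
      {label}
    </button>
  );
}

// Usage: <AppButton label="Save" variant="secondary" onClick={handleSave} />
```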
Looking for immediate joiners for the Pune location only, with good experience in B2B, SaaS, and field sales.
Team-handling experience is compulsory.
at Simplifai Cognitive Solutions Pvt Ltd
We are looking for a skilled Senior/Lead Big Data Engineer to join our team. The role is part of the research and development team, where your enthusiasm and knowledge will make you our technical evangelist for the development of our inspection technology and products.
At Elop we are developing product lines for sustainable infrastructure management, using our own patented ultrasound scanner technology and combining it with other sources to build a holistic overview of the concrete structure. At Elop we will provide you with world-class colleagues who are highly motivated to position the company as an international standard for structural health monitoring. With the right character you will be professionally challenged and developed.
This position requires travel to Norway.
Elop is a sister company of Simplifai, and the two are co-located in all geographic locations.
Roles and Responsibilities
- Define technical scope and objectives through research and participation in requirements gathering and definition of processes
- Ingest and process data from data sources (Elop Scanner) in raw format into the Big Data ecosystem
- Process real-time data feeds using the Big Data ecosystem
- Design, review, implement, and optimize data transformation processes in the Big Data ecosystem
- Test and prototype new data integration/processing tools, techniques, and methodologies
- Convert MATLAB code into Python/C/C++
- Participate in overall test planning for application integrations, functional areas, and projects
- Work with cross-functional teams in an Agile/Scrum environment to ensure a quality product is delivered
Desired Candidate Profile
- Bachelor's degree in Statistics, Computer Science, or equivalent
- 7+ years of experience in Big Data ecosystem, especially Spark, Kafka, Hadoop, HBase.
- 7+ years of hands-on experience in Python/Scala is a must.
- Experience architecting big data applications is needed.
- Excellent analytical and problem-solving skills
- Strong understanding of data analytics and data visualization, and the ability to help the development team with visualization of data.
- Experience with signal processing is a plus.
- Experience working on client-server architecture is a plus.
- Knowledge about database technologies like RDBMS, Graph DB, Document DB, Apache Cassandra, OpenTSDB
- Good communication skills, written and oral, in English
We can offer
- An everyday life with exciting and challenging tasks, developing socially beneficial solutions
- Being part of the company's research and development team, creating unique and innovative products
- Colleagues with world-class expertise, and an organization that has ambitions and is highly motivated to position the company as an international player in maintenance support and monitoring of critical infrastructure!
- A good working environment with skilled and committed colleagues, in an organization with short decision paths
- Professional challenges and development
Summary
Our Kafka developer has a combination of technical skills, communication skills, and business knowledge. The developer should be able to work on multiple medium to large projects. The successful candidate will have excellent technical skills in Apache/Confluent Kafka and an Enterprise Data Warehouse (preferably GCP BigQuery or an equivalent cloud EDW), and will be able to take oral and written business requirements and develop efficient code to meet set deliverables.
Must-Have Skills
- Participate in the development, enhancement and maintenance of data applications both as an individual contributor and as a lead.
- Leading in the identification, isolation, resolution and communication of problems within the production environment.
- Lead developer applying technical skills in Apache/Confluent Kafka (preferred) or AWS Kinesis (optional), and a cloud Enterprise Data Warehouse: Google BigQuery (preferred), AWS Redshift, or Snowflake (optional)
- Design and recommend the best-suited approach for data movement from different sources to the cloud EDW using Apache/Confluent Kafka
- Performs independent functional and technical analysis for major projects supporting several corporate initiatives.
- Communicate and work with IT partners and the user community at various levels, from senior management to individual developers to business SMEs, for project definition.
- Works on multiple platforms and multiple projects concurrently.
- Performs code and unit testing for complex-scope modules and projects
- Provide expertise and hands-on experience working with Kafka Connect using the schema registry in a very high-volume environment (~900 million messages)
- Provide expertise in Kafka brokers, ZooKeeper, KSQL, KStreams, and Kafka Control Center.
- Provide expertise and hands-on experience working with AvroConverter, JsonConverter, and StringConverter.
- Provide expertise and hands-on experience working with Kafka Connect components such as MQ connectors, Elasticsearch connectors, JDBC connectors, FileStream connectors, JMS source connectors, tasks, workers, converters, and transforms.
- Provide expertise and hands-on experience building custom connectors using the Kafka core concepts and API.
- Working knowledge of the Kafka REST Proxy.
- Ensure optimum performance, high availability and stability of solutions.
- Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and apply good knowledge of best practices.
- Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms (a minimal sketch follows this list). Leverage Hadoop ecosystem knowledge to design and develop capabilities to deliver solutions using Spark, Scala, Python, Hive, Kafka, and other tools in the Hadoop ecosystem.
- Use automation tools for provisioning, such as Jenkins, uDeploy, or relevant technologies
- Ability to perform data-related benchmarking, performance analysis, and tuning.
- Strong skills in in-memory applications, database design, and data integration.
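The stubs item above mentions producer and consumer stubs for onboarding applications. Below is a minimal sketch of such stubs using the kafkajs client for Node.js (one of several possible clients); the broker address, topic name, group ID, and message payload are placeholder assumptions, not part of the role description.

```ts
import { Kafka } from "kafkajs";

// Placeholder broker list, topic, and group ID — assumptions for illustration.
const kafka = new Kafka({ clientId: "demo-app", brokers: ["localhost:9092"] });

// Producer stub: publish one message to a topic.
async function produce() {
  const producer = kafka.producer();
  await producer.connect();
  await producer.send({
    topic: "events",
    messages: [{ key: "order-1", value: JSON.stringify({ status: "created" }) }],
  });
  await producer.disconnect();
}

// Consumer stub: read messages as part of a consumer group.
async function consume() {
  const consumer = kafka.consumer({ groupId: "demo-group" });
  await consumer.connect();
  await consumer.subscribe({ topics: ["events"], fromBeginning: true });
  await consumer.run({
    eachMessage: async ({ partition, message }) => {
      console.log(partition, message.key?.toString(), message.value?.toString());
    },
  });
}

produce().then(consume).catch(console.error);
```

Production stubs would additionally wire in the schema-registry serializers and error handling described in the list above.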
· Primarily responsible for working on various service requests related to maintaining database security (roles, privileges, authentication), creating schemas, and copying data (a minimal sketch follows this list)
· Work on the installation, configuration, creation, and administration of HP Vertica database clusters and AWS RDS MySQL databases.
· Work with platform, product, and other support teams to participate in release and change management processes, to implement new features/functions as well as to support mandatory patches or upgrade activities on weekends.
· Responsible for database performance tuning, query tuning, and database operations, including migration and upgrade.
· Plan and implement periodic upgrades of the database schema, and perform testing and verification of released database schema upgrades.
· Configure and manage database backup and restore, and participate in Disaster Recovery
· Create and maintain documentation, procedures and best practice guides for database deployment and maintenance.
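As an illustration of the roles/privileges service requests referenced above, here is a minimal sketch using the Node.js mysql2 client against a MySQL 8 instance (such as AWS RDS MySQL); the host, database, role, and user names are placeholder assumptions.

```ts
import mysql from "mysql2/promise";

// Placeholder connection details — assumptions for illustration only.
async function grantReadOnly() {
  const conn = await mysql.createConnection({
    host: "example-rds.cluster.amazonaws.com",
    user: "admin",
    password: process.env.DB_ADMIN_PASSWORD,
    database: "mysql",
  });

  // Create a read-only role and grant it to a hypothetical reporting user.
  await conn.query("CREATE ROLE IF NOT EXISTS readonly_role");
  await conn.query("GRANT SELECT ON reporting_db.* TO readonly_role");
  await conn.query("GRANT readonly_role TO 'report_user'@'%'");

  await conn.end();
}

grantReadOnly().catch(console.error);
```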
A young, vibrant, fast-growing IT services company.
5 years of architecture, design, and programming experience, preferably in a fast-paced, dynamic environment
Strong application design and implementation skills, solid understanding of the entire development cycle.
Strong background in Java/J2EE-based applications
Strong background in Spring/Spring Boot-based applications
Strong background in microservices-based applications
Experience working with Apache and/or Tomcat
Experience in transforming requirements to software design
Strong experience developing Java SaaS web applications.
Working experience with industry-standard protocols related to API security, including OAuth (an illustrative sketch follows this list).
Demonstrated strong design and programming skills using JSON, web services, XML, XSLT, and PL/SQL in Unix and Windows environments.
Strong background working with Linux/UNIX environments.
Strong Shell scripting experience.
Working knowledge of Oracle, DB2, or MongoDB databases.
Passion to stay on top of the latest happenings in the tech world and an attitude to discuss and bring those into play.
Strong agile/scrum development experience
Strong collaboration and communication skills within distributed project teams
Excellent written and verbal communication skills
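The API-security requirement above mentions OAuth. As a language-neutral illustration (sketched here in TypeScript/Express rather than the Java stack named above), this shows the bearer-token validation step at the heart of OAuth-protected APIs; the secret, route, and claim handling are assumptions for illustration only.

```ts
import express from "express";
import jwt from "jsonwebtoken";

const app = express();
const JWT_SECRET = process.env.JWT_SECRET ?? "dev-only-secret"; // placeholder

// Middleware: reject requests that lack a valid OAuth bearer token (JWT assumed).
app.use((req, res, next) => {
  const auth = req.headers.authorization ?? "";
  const token = auth.startsWith("Bearer ") ? auth.slice(7) : null;
  if (!token) return res.status(401).json({ error: "missing bearer token" });
  try {
    (req as any).claims = jwt.verify(token, JWT_SECRET); // throws if invalid/expired
    next();
  } catch {
    res.status(401).json({ error: "invalid or expired token" });
  }
});

// Hypothetical protected route echoing the verified claims.
app.get("/api/profile", (req, res) => {
  res.json({ claims: (req as any).claims });
});

app.listen(8080);
```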
We are looking for a Node.js Developer responsible for managing the interchange of data between the server and the users. Your primary focus will be the development of all server-side logic, definition and maintenance of the central database, and ensuring high performance and responsiveness to requests from the front-end. You will also be responsible for integrating the front-end elements built by your co-workers into the application. Therefore, a basic understanding of front-end technologies is necessary as well.
Responsibilities
- Integration of user-facing elements developed by front-end developers with server-side logic
- Writing reusable, testable, and efficient code
- Design and implementation of low-latency, high-availability, and performant applications
- Implementation of security and data protection
- Creating APIs and backend systems (a minimal sketch follows this list)
- Scraping sites and creating backend dashboards for data management
- Experience with React.js and React Native is a plus but not required
- Backend infrastructure management and deployment
- Experience with MongoDB and Express is a plus.
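As a minimal sketch of the server-side API work described above, here is a small Express endpoint pair in TypeScript; the route paths and the in-memory store are illustrative assumptions standing in for the central database.

```ts
import express from "express";

const app = express();
app.use(express.json()); // parse JSON request bodies

// Hypothetical in-memory store standing in for the central database.
const items: { id: number; name: string }[] = [];

// Read endpoint consumed by the front-end.
app.get("/api/items", (_req, res) => {
  res.json(items);
});

// Write endpoint: validate input, persist, return the created record.
app.post("/api/items", (req, res) => {
  const name = req.body?.name;
  if (typeof name !== "string" || name.length === 0) {
    return res.status(400).json({ error: "name is required" });
  }
  const item = { id: items.length + 1, name };
  items.push(item);
  res.status(201).json(item);
});

app.listen(3000, () => console.log("API listening on :3000"));
```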
We have a great work culture and offer amazing technical and architectural challenges.
The ideal candidate will be an enthusiastic developer eager to work on the innovative product.
Qualification: 4 to 8 years of relevant experience in Core Java. Excellent coding skills in Java, Spring/Spring Boot, React, and JavaScript.