
- To be successful as a Telemarketer you should have a positive attitude and excellent telephone etiquette.
- The ideal Telemarketer will remain calm and professional under pressure and always treat clients with respect.
- Contact potential clients telephonically.
- Read the prepared script when pitching the company's products and services.
- Provide any information that the client requests.

About Company:
GEVME is a fast-growing, Singapore-based virtual and hybrid event and engagement platform for building unique experiences. It is used by event professionals worldwide to build, operate, and monetise virtual events for some of the biggest brands. The platform's flexibility gives them limitless possibilities to turn any virtual event idea into reality. We have already powered hundreds of thousands of events around the world for clients like Facebook, Netflix, Starbucks, Forbes, MasterCard, and the Singapore Government.
We are a product company with a strong engineering and family culture; we are always looking for new ways to enhance the event experience and empower efficient event management. We’re on a mission to groom the next generation of event technology thought leaders as we grow.
Join us if you want to become part of a vibrant and fast-moving product company that's on a mission to connect people around the world through events.
Do check out our platform GEVME.
Location: Remote/Work From Home
What you'll be doing:
- Writing reusable, testable, and efficient code in Node.js for back-end services.
- Ensuring optimal, high-performance logic for data flowing to and from the database.
- Collaborating with front-end developers on the integrations.
- Implementing effective security protocols, data protection measures, and storage solutions.
- Preparing technical specification documents for the developed features.
- Providing technical recommendations and suggesting improvements to the product.
- Writing unit test cases for APIs.
- Documenting code standards and practicing them.
- Staying updated on the advancements in the field of Node.js development.
- Being open to new challenges and comfortable taking on new exploratory tasks.
Skills:
- 4-6 years of strong proficiency in Node.js and its core principles.
- Experience in test-driven development.
- Experience with NoSQL databases such as MongoDB is required.
- Experience with MySQL databases.
- RESTful/GraphQL API design and development
- Docker and AWS experience is a plus
- Extensive knowledge of JavaScript, PHP, web stacks, libraries, and frameworks.
- Strong interpersonal, communication, and collaboration skills.
- Exceptional analytical and problem-solving aptitude
- Experience with version control systems such as Git.
- Knowledge of the Software Development Life Cycle model, secure development best practices and standards, source control, code review, build and deployment, and continuous integration.

The Sr AWS/Azure/GCP Databricks Data Engineer at Koantek will use comprehensive modern data engineering techniques and methods with Advanced Analytics to support business decisions for our clients. Your goal is to support the use of data-driven insights to help our clients achieve business outcomes and objectives. You can collect, aggregate, and analyze structured/unstructured data from multiple internal and external sources and communicate patterns, insights, and trends to decision-makers. You will help design and build data pipelines, data streams, reporting tools, information dashboards, data service APIs, data generators, and other end-user information portals and insight tools. You will be a critical part of the data supply chain, ensuring that stakeholders can access and manipulate data for routine and ad hoc analysis to drive business outcomes using Advanced Analytics. You are expected to function as a productive member of a team, working and communicating proactively with engineering peers, technical leads, project managers, product owners, and resource managers.
Requirements:
- Strong experience as an AWS/Azure/GCP Data Engineer; must have AWS/Azure/GCP Databricks experience
- Expert proficiency in Spark (Scala and Python)
- Must have data migration experience from on-prem to cloud
- Hands-on experience with Kinesis to process and analyze stream data, Event/IoT Hubs, and Cosmos DB
- In-depth understanding of Azure/AWS/GCP cloud, data lake, and analytics solutions
- Expert-level hands-on experience designing and developing applications on Databricks
- Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services
- In-depth understanding of the Spark architecture, including Spark Streaming, Spark Core, Spark SQL, DataFrames, RDD caching, and Spark MLlib
- Hands-on experience with the industry technology stack for data management, data ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc.
- Hands-on knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake
- Good working knowledge of code versioning tools such as Git, Bitbucket, or SVN
- Hands-on experience using Spark SQL with data sources such as JSON, Parquet, and key-value pairs
- Experience preparing data for data science and machine learning, with exposure to model selection, model lifecycle, hyperparameter tuning, model serving, deep learning, etc.
- Demonstrated experience preparing data and automating and building data pipelines for AI use cases (text, voice, image, IoT data, etc.)
- Good to have: programming experience with .NET or Spark/Scala
- Experience creating tables, partitioning, bucketing, and loading and aggregating data using Spark Scala and Spark SQL/PySpark
- Knowledge of AWS/Azure/GCP DevOps processes such as CI/CD, as well as Agile tools and processes including Git, Jenkins, Jira, and Confluence
- Working experience with Visual Studio, PowerShell scripting, and ARM templates; able to build ingestion to ADLS and enable a BI layer for analytics
- Strong understanding of data modeling and defining conceptual, logical, and physical data models
- Big data/analytics/information analysis/database management in the cloud
- IoT/event-driven/microservices in the cloud; experience with private and public cloud architectures, their pros/cons, and migration considerations
- Ability to stay up to date with industry standards and technological advancements that enhance data quality and reliability to advance strategic initiatives
- Working knowledge of RESTful APIs, the OAuth2 authorization framework, and security best practices for API gateways
- Guide customers in transforming big data projects, including development and deployment of big data and AI applications
- Guide customers on data engineering best practices, provide proofs of concept, architect solutions, and collaborate as needed
- 2+ years of hands-on experience designing and implementing multi-tenant solutions using AWS/Azure/GCP Databricks for data governance, data pipelines for near-real-time data warehousing, and machine learning solutions
- 5+ years' overall experience in software development, data engineering, or data analytics using Python, PySpark, Scala, Spark, Java, or equivalent technologies, with hands-on expertise in Apache Spark (Scala or Python)
- 3+ years of experience in query tuning, performance tuning, troubleshooting, and debugging Spark and other big data solutions
- Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience
- Ability to manage competing priorities in a fast-paced environment
- Ability to resolve issues
- Basic experience with or knowledge of agile methodologies
- AWS Certified Solutions Architect Professional
- Databricks Certified Associate Developer for Apache Spark
- Microsoft Certified: Azure Data Engineer Associate
- Google Cloud Certified: Professional
Skills required
· Certified in Google Analytics (any popular certifier)
· A minimum of 4 years of experience in the web analytics domain
· Solid knowledge of the GA4 platform
· Excellent communication skills (written & verbal)
· Good understanding of Analytics, Tag Manager, and Data Studio.
· Basic knowledge of BigQuery & JavaScript
· Familiar with HTML, CSS, and JavaScript, with the ability to read, reuse, and customise code
· Advanced understanding of Events Tagging and Custom dimensions.
· Hands-on experience with data visualization (custom dashboards)
Roles and Responsibilities
· Responsible for managing our Analytics (GA4) and GTM accounts
· Implementation of highly effective web data analytic solutions
· Visualize customer behavioral data such as page/path analysis, clickstream, funnel progression, CTA optimization, page/site abandonment, etc.
· Implement tagging and configuration that ensures data collection is accurate and adequate based on the business needs
In-depth knowledge of the following areas:
· Determine tracking requirements.
· Create functional and technical designs (DOM scraping & custom HTML) for the tags.
· Install GTM container tag on the live/staging websites.
· Create, publish and test tags on the live/staging websites.
· Able to track successful form submissions.
· Conduct regular tag audits for the live/staging websites.
· Coordinate with the marketing heads to understand and develop custom reports and dashboards.
· Create specified digital analytics tagging standards to ensure robust and consistent data capturing.
· Testing and validating the tracking using various debugging tools.
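The tagging-standards responsibility above amounts to checking that every event pushed to the dataLayer carries the parameters the standard requires. A minimal sketch of such an audit in Python; the event and parameter names are invented for the example, not a real GA4 schema:

```python
# Illustrative tagging-standard audit: each event type must carry a
# required set of parameters, and non-conforming pushes are reported.
REQUIRED_PARAMS = {
    "form_submit": {"form_id", "page_path"},
    "cta_click":   {"cta_label", "page_path"},
}

def audit_events(events):
    """Return (event_name, missing_params) for each non-conforming event."""
    failures = []
    for e in events:
        required = REQUIRED_PARAMS.get(e.get("event"), set())
        missing = required - set(e)
        if missing:
            failures.append((e.get("event"), sorted(missing)))
    return failures

pushed = [
    {"event": "form_submit", "form_id": "contact", "page_path": "/contact"},
    {"event": "cta_click", "cta_label": "Sign up"},  # missing page_path
]
print(audit_events(pushed))  # -> [('cta_click', ['page_path'])]
```

In practice the same check would run against events captured with a debugging tool such as GTM preview mode, rather than a hard-coded list.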
Application Integration Engineer
Key Skills:
· Knowledge of various Integrations approaches:
o Native connectors
o API – schema, design
o ODBC
o SFTP files
· Experience integrating SaaS solutions.
o Financial applications:
§ Ideally: Oracle NetSuite, Workday Adaptive
§ Nice to have: Coupa, Concur, Expensify, Avalara
o CRM applications:
§ Ideally: Microsoft Dynamics
§ Nice to have: Salesforce, Hubspot Sales, various sales tools
· Experience with key Microsoft Azure services: mainly data extraction (i.e., Data Factory), databases (SQL), storage (Data Lake), analytics
· Ability to work with data, including ETL
· Nice to have: knowledge of JavaScript, XML, REST, etc.
· Experience working with global teams
· Ability to overlap some work hours with US EST/CT time
· Solid communication skills
· Proactive, takes initiative, outspoken
Experience
· 6 - 9 years
- Design and build sophisticated and highly scalable apps using Flutter.
- Translate and Build the designs into high-quality responsive UI code.
- Write efficient queries for Core Data.
- Use Model-View-Controller (MVC) and Model-View-ViewModel (MVVM) architectures and develop maintainable, testable, and functional software that meets product requirements.
- Resolve any problems existing in the system and suggest and add new features in the complete system.
- Follow the best practices while developing the app.
- Use CI/CD for smooth deployment.
- Document the project and code efficiently.
- Manage the code and project on Git to keep in sync with other team members and managers.
- Suggest new features and/or enhancements.
- Maintain software through the product life cycle, including design, development, verification, and bug fixes.
- Write tests for the App.
- Knowledge of state management libraries such as BLoC, GetX, and Provider will be a plus.
Minimum of 8 years of experience, of which 4 years should be applied data mining experience in disciplines such as call centre metrics.
Strong experience in advanced statistics and analytics, including segmentation, modelling, regression, forecasting, etc.
Experience with leading and managing large teams.
Demonstrated pattern of success in using advanced quantitative analytic methods to solve business problems.
Demonstrated experience with Business Intelligence/Data Mining tools to work with
data, investigate anomalies, construct data sets, and build models.
It is critical to share details of projects undertaken (preferably in the telecom industry), specifically analyses based on CRM data.
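The regression and forecasting work described above can be illustrated with a minimal ordinary-least-squares fit on toy call-centre data; the numbers and variable names are invented for the sketch, and a real engagement would use a statistics library rather than hand-rolled math:

```python
# Minimal OLS fit of y = a + b*x on toy call-centre data
# (calls offered per hour vs. agents needed), then a staffing forecast.
def ols(xs, ys):
    """Fit y = a + b*x by ordinary least squares."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

calls  = [100, 200, 300, 400]   # calls offered per hour (toy data)
agents = [12, 22, 32, 42]       # agents needed at that volume
a, b = ols(calls, agents)
forecast = a + b * 500          # forecast staffing for 500 calls/hour
```

On this toy data the relationship is exactly linear (agents = 2 + 0.1 × calls), so the fitted slope is 0.1 and the 500-call forecast is 52 agents.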

Recently Expertrons raised a funding of $2.3 Mn in the Pre Series A round witnessing investors like Yoga Capital, Venture Catalysts, Auxano Capital, and existing investors LetsVenture, Ivycap Ventures, Iceland Venture Studios, Nikhil Vora (MD, Sixth Sense), and more.
Website: https://www.expertrons.com/
Expertrons Android App: http://bit.ly/expertrons
Product Demo Video: https://bit.ly/3sFG0G1
Watch Explainer Video: https://bit.ly/3zeBmRM
Responsibilities and Duties
1. Build pixel-perfect, buttery smooth UIs across both mobile platforms
2. Strong knowledge of React workflows (such as Flux or Redux)
3. Implement clean, modern, smooth animations and transitions that provide an excellent user experience
4. Integrate third-party APIs
5. Release applications to the iOS App Store and Google Play Store
6. Ability to work through new and difficult React Native issues and contribute to libraries as needed
7. Ability to create and maintain continuous integration and delivery of React Native applications
8. Experience with code optimization and performance improvements
9. Experience with MongoDB, Node.js, and AWS is a plus
10. Strong knowledge of Git
Required Skills:
1. Strong algorithms, data structures, troubleshooting, problem-solving, and analytical skills
2. Technologies: React Native, Redux, Redux Saga, Flux, Angular, and NoSQL databases
The idea is not to live forever, but to create something that will!
Be a part of our growing team and climb up the ladder of success. So, if you have the expertise, skills, and quirks that can help you add value to Expertrons, apply with us now!

AWS Data Engineer:
Job Description
- 3+ years of experience in AWS data engineering.
- Design and build ETL pipelines and data lakes to automate ingestion of structured and unstructured data.
- Experience working with AWS big data technologies (Redshift, S3, AWS Glue, Kinesis, Athena, DMS, EMR, and Lambda for serverless ETL).
- Should have knowledge of SQL and NoSQL databases.
- Have worked on batch and real-time pipelines.
- Excellent programming and debugging skills in Scala or Python, and Spark.
- Good experience in data lake formation, Apache Spark, and Python; hands-on experience deploying models.
- Must have experience with the production migration process.
- Nice to have: experience with Power BI visualization tools and connectivity.
Roles & Responsibilities:
- Design, build, and operationalize large-scale enterprise data solutions and applications.
- Analyze, re-architect, and re-platform on-premise data warehouses to data platforms on the AWS cloud.
- Design and build production data pipelines from ingestion to consumption within the AWS big data architecture, using Python or Scala.
- Perform detailed assessments of current-state data platforms and create an appropriate transition path to the AWS cloud.
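The ingestion-to-consumption pipelines above follow an extract, transform, load shape. A minimal, self-contained sketch of that shape; in the role itself the stages would be S3/Glue/Redshift rather than in-memory lists, and every name and record here is illustrative:

```python
# Toy batch ETL: parse newline-delimited JSON (a common S3 landing
# format), cast and clean records, then aggregate for consumption.
import json
from collections import defaultdict

RAW = """{"user": "a", "amount": "10.5", "country": "SG"}
{"user": "b", "amount": "3.0", "country": "SG"}
{"user": "c", "amount": "bad", "country": "IN"}"""

def extract(lines):
    """Parse newline-delimited JSON records."""
    return [json.loads(line) for line in lines.splitlines()]

def transform(records):
    """Cast types and drop malformed rows."""
    out = []
    for r in records:
        try:
            out.append({**r, "amount": float(r["amount"])})
        except ValueError:
            continue  # a real pipeline would route this to a dead-letter store
    return out

def load(records):
    """Aggregate to a per-country revenue table."""
    totals = defaultdict(float)
    for r in records:
        totals[r["country"]] += r["amount"]
    return dict(totals)

result = load(transform(extract(RAW)))
print(result)  # -> {'SG': 13.5}
```

Keeping each stage a pure function makes the pipeline easy to test and to re-run idempotently, which matters once the same logic is scheduled against real buckets and warehouses.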
We have job openings for our Client E Productivity Services. Kindly find the details below:
Payroll will be ToppersEdge.com India Pvt Ltd working for the client E Productivity Services.
(C2H position – 3rd Party payroll for long term)
Job Description
Experience – 5-10 years
Minimum 4 + years of experience in Automation testing, Java and Selenium
Mandatory Skills - Automation testing, Java and Selenium
Looking for candidates with a 0-20 day notice period.
The budget is a maximum of 13 LPA.


Full Stack Software Developers are required to work in teams with other front-end and back-end developers to ensure all elements of the software creation are realized optimally. This requires an excellent appreciation of art and engineering, as well as communication and interpersonal skills.
You'll be working for a New Delhi-based advertising agency, founded by successful executives with a proven track record.
Responsibilities
● Develop new interface between various software components
● Build reusable code and libraries for future use
● Ensure the technical feasibility of the design
● Optimize application for maximum speed and scalability
● Assure that all user input is validated before submitting to back-end
● Collaborate with other team members and stakeholders
Skills And Qualifications
● English is a must.
● Proficient in cross-platform front-end design and tools for web and mobile.
● Proficient understanding of web markup, including HTML5, CSS3
● Basic understanding of server-side CSS pre-processing platforms, such as LESS and SASS
● Proficient understanding of client-side scripting and JavaScript frameworks, including jQuery
● Note: Every developer is expected to have a proficient knowledge of JavaScript/Typescript
● Good understanding of various JavaScript libraries and frameworks, especially AngularJS
● Good to have: an understanding of asynchronous request handling, partial page updates, and PWAs (Progressive Web Apps)
● Proficient understanding of cross-browser compatibility issues and ways to work around them.
● Proficient understanding of code versioning tools, such as Git
● You will be working with cutting-edge technologies, so a good understanding of the following software components/architecture will be helpful: Node.js
Education : BS Computer Science or Equivalent

