
The candidate should have knowledge of gold quality, diamonds, pearls, etc.
The candidate should handle enquiries and convert them into sales.
The candidate should be smart and well-presented.

Domain: Automation Testing – Selenium
Job Purpose
• Thought leadership - Consult with the client and set up/guide a team to strategize, design, implement, and run automation solutions supporting a variety of applications
• Technical leadership - Identify, evaluate, recommend, and optimize current automated testing processes and tools
• Define and implement test automation strategy including roadmap, tools, framework, and approach across project teams
• Oversee technological delivery of the recommended automation solution roadmap across project teams
• Set up end-to-end QA processes across requirement analysis, test strategy/design, and test reporting
• Participate in design and architectural reviews of proposed automation solutions across project teams
• Designing & implementing enterprise-wide QA strategy for variety of clients including complex applications across multiple tech stacks (involving both functional and automation testing)
• Planning, estimating (should be aware of robust models and how to use them) and tracking team’s work
• Status reporting: Track and report upon testing activities, including testing results, test case coverage, required resources, defects discovered and their status etc.
• Mentors/guides the team on technical knowhow
• Adhere to company project standards and guidelines
Mandatory Skills & Experience
• 7–11 years of team management experience and at least 4 years of in-depth experience in establishing test automation frameworks, evaluating tools, and implementing continuous testing strategy
• Hands on framework development for a Green Field Project
• Strong hands-on experience in Java/JavaScript programming languages; Java collection frameworks
• Strong experience with QA automation tools such as Selenium/Cucumber/Appium/SoapUI etc.
• Strong exposure on UI or API automation, hands on Webservices/Microservices automation
• Experience in CI/CD tools, such as Gitlab(preferable) / Jenkins
• Experience in Cloud tech preferably AWS
• Experience on Cloud Platforms such as Sauce Labs, Perfecto
• Functional tools like Selenium (mandatory), WebdriverIO, Nightwatch.js, etc.
• Should have basic knowledge of JMeter or any other equivalent Performance testing tool
• Sound Knowledge of methodologies and approaches such as Agile, BDD, DevOps etc.
• Industry experience on Financial Services is preferred.
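Frameworks of the kind described above typically organize UI automation around the Page Object pattern. The sketch below illustrates that pattern in plain Python; the `StubDriver` stands in for a real Selenium WebDriver so the example runs without a browser, and all class, locator, and field names are hypothetical.

```python
# Minimal sketch of the Page Object pattern used in Selenium frameworks.
# StubDriver/StubElement are hypothetical stand-ins for selenium.webdriver
# objects so this runs without a browser; a real page object would wrap
# WebDriver calls with the same structure.

class StubElement:
    def __init__(self):
        self.value = ""

    def send_keys(self, text):          # mimics WebElement.send_keys
        self.value += text

    def click(self):                    # mimics WebElement.click
        self.value = "clicked"

class StubDriver:
    """Stand-in for a WebDriver instance; records elements by locator."""
    def __init__(self):
        self.fields = {}

    def find_element(self, locator):
        return self.fields.setdefault(locator, StubElement())

class LoginPage:
    """Page object: encapsulates one page's locators and user actions."""
    USERNAME = "id=username"
    PASSWORD = "id=password"
    SUBMIT = "id=submit"

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.find_element(self.USERNAME).send_keys(user)
        self.driver.find_element(self.PASSWORD).send_keys(password)
        self.driver.find_element(self.SUBMIT).click()

driver = StubDriver()
LoginPage(driver).login("alice", "secret")
print(driver.fields["id=username"].value)  # alice
```

Keeping locators and actions inside the page class is what lets tests stay readable when the UI changes: only the page object is updated, not every test.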
About Us
Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting and customer obsession to accelerate our clients’ businesses through designing the products and services their customers truly value.
Additional Information
• Gender Neutral Policy
• 18 paid holidays throughout the year for NCR/BLR (22 For Mumbai)
• Generous parental leave and new parent transition program
• Flexible work arrangements
• Employee Assistance Programs to help you in wellness and well-being
Hiring for Azure Data Engineers.
Location: Bangalore
Employment type: Full-time, permanent
website: www.amazech.com
Qualifications:
B.E./B.Tech/M.E./M.Tech in Computer Science, Information Technology, Electrical or Electronics Engineering with a good academic background.
Experience and Required Skill Sets:
• Minimum 5 years of hands-on experience with Azure Data Lake, Azure Data Factory, SQL Data Warehouse, Azure Blob, Azure Storage Explorer
• Experience in Data warehouse/analytical systems using Azure Synapse.
• Proficient in creating Azure Data Factory pipelines for ETL processing: copy activity, custom Azure development, Synapse, etc.
• Knowledge of Azure Data Catalog, Event Grid, Service Bus, SQL, and Purview.
• Good technical knowledge in Microsoft SQL Server BI Suite (ETL, Reporting, Analytics, Dashboards) using SSIS, SSAS, SSRS, Power BI
• Design and develop batch and real-time streaming of data loads to data warehouse systems
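The batch loads mentioned above are commonly built on the high-watermark incremental-copy pattern that an Azure Data Factory copy activity implements. The following is an illustrative pure-Python sketch of that pattern, not the Azure SDK; the sample rows and the `modified` watermark column are hypothetical.

```python
# Illustrative sketch (not the Azure SDK) of a high-watermark incremental
# load: copy only rows changed since the last successful run, then advance
# the watermark. Sample data and column names are hypothetical.

source = [
    {"id": 1, "modified": "2024-01-01", "amount": 10},
    {"id": 2, "modified": "2024-01-03", "amount": 20},
    {"id": 3, "modified": "2024-01-05", "amount": 30},
]
warehouse = []                # stand-in for the warehouse table
watermark = "2024-01-02"      # timestamp of the last loaded change

def incremental_copy(rows, last_watermark):
    """Return rows modified after the watermark, plus the new watermark."""
    new_rows = [r for r in rows if r["modified"] > last_watermark]
    new_watermark = max((r["modified"] for r in new_rows),
                        default=last_watermark)
    return new_rows, new_watermark

batch, watermark = incremental_copy(source, watermark)
warehouse.extend(batch)
print(len(warehouse), watermark)  # 2 2024-01-05
```

Persisting the watermark between runs is what makes the load restartable: a failed run simply reprocesses from the last committed watermark.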
Other Requirements:
A Bachelor's or Master's degree (Engineering or computer-related degree preferred)
Strong understanding of Software Development Life Cycles including Agile/Scrum
Responsibilities:
• Ability to create complex, enterprise-transforming applications that meet and exceed client expectations.
• Responsible for the bottom line, with strong project management abilities and the ability to keep the team on timelines.
Who we are:
Software is the connective tissue for much of the information economy. At Sudoviz, we are making it super easy for companies to build and operate secure software. Sudoviz is an Application Security Posture Management platform for enterprise AppSec and software development teams. We're a COVID-era, fully remote startup transforming the way enterprise software teams do security.
About you:
• We're looking for someone scrappy, hungry, and eager to take on the challenge as a Backend Engineer on our Platform Engineering Team.
• An ideal candidate has strong programming experience and has experience in architecting and building highly scalable systems.
• You are energetic and enthusiastic about learning and teaching.
• You understand and uphold the values and company culture as well as possess a positive mindset and a can-do attitude.
• You should be willing to contribute across all elements of the technical stack while working on our Platform
Minimum qualifications:
• 2+ years of professional backend software development experience
• Experience designing and implementing highly scalable and performant RESTful microservices and GraphQL APIs
• Proficiency in Python web and data science libraries (Flask, Django, pandas, NumPy)
• Fluency in database technologies (e.g. RDBMS, NoSQL, and graph databases such as Neo4j)
• Experience with version control tools like Git
• Experience using AWS, Kubernetes, and Docker containers
• Agile/Scrum/Lean development methodology experience
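The GraphQL experience called for above boils down to field resolution: each field maps to a resolver function, and nested fields resolve against the parent object. The sketch below hand-rolls that idea in plain Python purely for illustration; a real service would use a library such as graphene or Ariadne, and the sample data and resolver names are hypothetical.

```python
# Hand-rolled sketch of GraphQL-style resolution (no library): a top-level
# field resolves to an object, and a nested field resolves against it.
# USERS/REPOS and the resolver names are illustrative only.

USERS = {1: {"id": 1, "name": "Ada", "repo_ids": [10]}}
REPOS = {10: {"id": 10, "name": "scanner"}}

RESOLVERS = {
    "user": lambda args: USERS[args["id"]],            # root resolver
    "repos": lambda user: [REPOS[i] for i in user["repo_ids"]],  # nested
}

def execute(query_field, args, sub_field=None):
    """Resolve a top-level field, then optionally one nested field."""
    obj = RESOLVERS[query_field](args)
    result = {query_field: obj}
    if sub_field:
        result[sub_field] = RESOLVERS[sub_field](obj)
    return result

result = execute("user", {"id": 1}, "repos")
print(result["repos"][0]["name"])  # scanner
```

The design point is that resolvers stay independent of transport and storage, which is also what makes them easy to unit-test in isolation.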
Must have
• Bachelor's degree in Computer Science
• Good project management skills and documentation skills
• Proficiency in Written and Verbal communication
• Positive solution-oriented mindset
• Ability to achieve consensus with peers and clearly share status updates
• Self-motivated and Self-managing individual
• Clearly and concisely communicate complex technical and architectural problems
Nice to have
1. Passion for Open source contribution
2. Experience in peak performance organizations/product companies
3. Experience working with a remote team with a global culturally diverse team
EverestEngineering – software engineering when you need it. High-quality, scalable, distributed development teams ready to help you now. Sustainable software development. Fit for purpose. Doing the right thing both for our customers and for yours.
As a team, we are passionate and motivated by the impact our organization can have on our customers, employees, the industry, and the world. We are here to accelerate software innovation by digitally connecting the global workforce. Through our experience, we know that working with remote teams and getting it right can become a competitive advantage for organizations. We want to provide this advantage to customers and partners that are trying to have a positive impact on the world through software innovation.
Excellence is at the forefront of our mission. We see an opportunity to shift the narrative of working with offshore teams - from a frustrating cost-cutting exercise to a beneficial value addition. https://everest.engineering/
Our experience means we understand the problems software companies face trying to build offshore distributed teams because we've been there before ourselves.
We work iteratively together to manage these problems for you, or to provide extra capacity in peak times, so that you can do what you do best – deliver amazing innovations and delight your customers.
To see the quality of our code, you can check out some of our open-source projects: https://github.com/everest-engineering
Specialties-
Big Data, Web Development, Mobile Development, Agile Development, Data Analytics, Software Product Development, and Remote Working
Headquarters: Melbourne, Victoria
We love people who in general -
- have a passion to own and create amazing products.
- are able to clearly understand the customer’s problem.
- are good collaborative problem solvers.
- are really, really good team players.
- are open to learning from others and teaching others.
- are able to take meaningful feedback and improve continuously.
- can commit to inclusion, equality & diversity.
- can maintain integrity at work.
You are the one if -
- you love solving problems.
- you have a keen interest in understanding and analysing the customer problems and solutions.
- you possess good listening, communication (verbal & written) and presentation skills.
- you can empathise easily with customers.
- you can facilitate and lead workshops that generate customised business solutions.
To be successful in this role, you need to -
- analyse the as-is system and collaborate with clients to create artefacts (personas, journeys, epics, stories, to-be system etc.) to outline business vision, objective, product roadmap, and a project release plan.
- effectively manage the scope and constraints of the project.
- work in agile teams that help in the successful delivery of a project.
- effectively prioritise and obtain buy-in from all the stakeholders to help the team with the requirements.
- write detailed stories to help the team understand the requirements.
- demonstrate flexibility in picking up things/roles needed for the successful delivery of the project.
- collaborate with the UX team to understand and contribute to the user research & design process.
- be an effective liaison between the client and your team to manage the product backlog and keep an eye on the software delivery.
Experience: 8 to 10 years; notice period: 0 to 20 days
Job Description :
- Provision GCP resources based on the architecture design and features aligned with business objectives
- Monitor resource availability and usage metrics, and provide guidelines for cost and performance optimization
- Assist IT/business users in resolving GCP service-related issues
- Provide guidelines for cluster automation and migration approaches and techniques, including ingest, store, process, analyse and explore/visualise data
- Provision GCP resources for data engineering and data science projects
- Assist with automated data ingestion, data migration and transformation (good to have)
- Assist with deployment and troubleshooting of applications in Kubernetes
- Establish connections and credibility in addressing business needs via designing and operating cloud-based data solutions
Key Responsibilities / Tasks :
- Building complex CI/CD pipelines for cloud-native PaaS services such as databases, messaging, storage, and compute in Google Cloud Platform
- Building deployment pipelines with GitHub CI (Actions)
- Writing Terraform code to deploy infrastructure as code
- Working with deployment and troubleshooting of Docker, GKE, OpenShift, and Cloud Run
- Working with Cloud Build, Cloud Composer, and Dataflow
- Configuring software to be monitored by AppDynamics
- Configuring Stackdriver logging and monitoring in GCP
- Working with Splunk, Kibana, Prometheus and Grafana to set up dashboards
Your skills, experience, and qualification :
- Total experience of 5+ years in DevOps, with at least 4 years of experience in Google Cloud and GitHub CI.
- Should have strong experience in microservices/APIs.
- Should have strong experience in DevOps tools like GitHub CI, TeamCity, Jenkins and Helm.
- Should know application deployment and testing strategies in Google Cloud Platform.
- Defining and setting development, test, release, update, and support processes for DevOps operation
- Strive for continuous improvement and build continuous integration, continuous delivery, and continuous deployment pipelines (CI/CD)
- Excellent understanding of Java
- Knowledge of Kafka, ZooKeeper, Hazelcast, Pub/Sub is nice to have.
- Understanding of cloud networking and security, such as software-defined networking/firewalls, virtual networks and load balancers.
- Understanding of cloud identity and access management
- Understanding of the compute runtime and the differences between native compute, virtual machines and containers
- Configuring and managing databases such as Oracle, Cloud SQL, and Cloud Spanner.
- Excellent troubleshooting skills
- Working knowledge of various tools, open-source technologies
- Awareness of critical concepts of Agile principles
- Certification as a Google Professional Cloud DevOps Engineer is desirable.
- Experience with Agile/SCRUM environment.
- Familiar with Agile Team management tools (JIRA, Confluence)
- Understand and promote Agile values: FROCC (Focus, Respect, Openness, Commitment, Courage)
- Good communication skills
- Pro-active team player
- Comfortable working in multi-disciplinary, self-organized teams
- Professional knowledge of English
- Differentiators : knowledge/experience about
Job Location: India
Job Summary
We at Condé Nast are looking for a Data Science Manager for the content intelligence
workstream primarily, although there might be some overlap with other workstreams. The
position is based out of Chennai and will report to the head of the data science team, Chennai.
Responsibilities:
1. Ideate new opportunities within the content intelligence workstream where data Science can
be applied to increase user engagement
2. Partner with business and translate business and analytics strategies into multiple short-term
and long-term projects
3. Lead data science teams to build quick prototypes to check feasibility and value to business
and present to business
4. Formulate the business problem into a machine learning/AI problem
5. Review & validate models & help improve the accuracy of models
6. Socialize & present the model insights in a manner that business can understand
7. Lead & own the entire value chain of a project/initiative life cycle - interface with business,
understand the requirements/specifications, gather data, prepare it, train, validate and test the
model, create business presentations to communicate insights, monitor/track the performance
of the solution and suggest improvements
8. Work closely with ML engineering teams to deploy models to production
9. Work closely with data engineering/services/BI teams to help develop data stores, intuitive
visualizations for the products
10. Setup career paths & learning goals for reportees & mentor them
Required Skills:
1. 5+ years of experience in leading Data Science & Advanced analytics projects with a focus on
building recommender systems and 10-12 years of overall experience
2. Experience in leading data science teams to implement recommender systems using content
based, collaborative filtering, embedding techniques
3. Experience in building propensity models, churn prediction, NLP - language models,
embeddings, recommendation engine etc
4. Master’s degree with an emphasis in a quantitative discipline such as statistics, engineering,
economics or mathematics/ Degree programs in data science/ machine learning/ artificial
intelligence
5. Exceptional Communication Skills - verbal and written
6. Moderate level proficiency in SQL, Python
7. Needs to have demonstrated continuous learning through external certifications, degree
programs in machine learning & artificial intelligence
8. Knowledge of Machine learning algorithms & understanding of how they work
9. Knowledge of Reinforcement Learning
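The item-based collaborative filtering mentioned in the skills above works by scoring candidate items against items the user already liked, usually via cosine similarity over their rating vectors. Below is a minimal, stdlib-only sketch of that idea; the toy ratings matrix and item names are hypothetical.

```python
# Minimal item-based collaborative-filtering sketch: recommend the unseen
# item whose user-rating vector is most cosine-similar to a liked item.
# The toy ratings (items A..C rated by three users) are hypothetical.

from math import sqrt

ratings = {           # item -> vector of ratings by users u1, u2, u3
    "A": [5, 4, 0],
    "B": [4, 5, 1],
    "C": [0, 1, 5],
}

def cosine(u, v):
    """Cosine similarity between two rating vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def recommend(liked_item, seen):
    """Unseen item most similar to an item the user already liked."""
    candidates = [i for i in ratings if i not in seen]
    return max(candidates,
               key=lambda i: cosine(ratings[liked_item], ratings[i]))

print(recommend("A", seen={"A"}))  # B
```

Production recommenders replace the dense loop with sparse matrix operations (e.g. in PySpark or scikit-learn) and often learned embeddings, but the similarity-and-rank structure is the same.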
Preferred Qualifications
1. Expertise in libraries for data science - PySpark (Databricks), scikit-learn, pandas, NumPy,
matplotlib, PyTorch/TensorFlow/Keras etc.
2. Working Knowledge of deep learning models
3. Experience in ETL/ data engineering
4. Prior experience in e-commerce, media & publishing domain is a plus
5. Experience in digital advertising is a plus
About Condé Nast
CONDÉ NAST INDIA (DATA)
Over the years, Condé Nast successfully expanded and diversified into digital, TV, and social
platforms - in other words, a staggering amount of user data. Condé Nast made the right move
to invest heavily in understanding this data and formed a whole new Data team entirely
dedicated to data processing, engineering, analytics, and visualization. This team helps drive
engagement, fuel process innovation, further content enrichment, and increase market
revenue. The Data team aims to create a company culture where data is the common
language and to facilitate an environment where insights shared in real time can improve
performance.
The Global Data team operates out of Los Angeles, New York, Chennai, and London. The team
at Condé Nast Chennai works extensively with data to amplify its brands' digital capabilities and
boost online revenue. We are broadly divided into four groups, Data Intelligence, Data
Engineering, Data Science, and Operations (including Product and Marketing Ops, Client
Services) along with Data Strategy and monetization. The teams built capabilities and products
to create data-driven solutions for better audience engagement.
What we look forward to:
We want to welcome bright, new minds into our midst and work together to create diverse
forms of self-expression. At Condé Nast, we encourage the imaginative and celebrate the
extraordinary. We are a media company for the future, with a remarkable past. We are Condé
Nast, and It Starts Here.
- Actively engage in lead response management and communication to make leads sales-ready
- Coordinate Digital Marketing activities like LinkedIn posts, cold emails etc.
- Participate in sales meetings & maintain sales tracker
- Maximize business opportunities by promoting services offered by the company to existing clients
- Identify & Onboard new reseller partners
- Plan Business Development Activities
- Co-ordinate with Vendors
- Demonstrate solution to the clients
- Achieve quarterly Sales targets
Candidates with enterprise IT sales experience or prior work experience in an IT company would be preferable.
Our client is among the rare success stories of Indian tech companies that are publicly listed. They are a pioneer in the adtech space and have built up market leadership at the global level serving fast growing digital native companies with the ability to do better audience targeting and get more ROI on their ad spends.
As a Data Operations (CPS) specialist, you will be responsible for overall coordination to manage end-to-end ad campaign delivery. You will also be responsible for interacting with all publishers the company works with and managing ongoing relationships with them.
What you will do:
- Being responsible for the CPS affiliate channel, disbursement of offers, quality of delivery, margin management, and channel expansion
- Maximization of campaign deliveries in terms of ROI as well as absolute volume while maintaining the required margin targets
- Identifying new affiliates
- Growing business from existing affiliates
- Being responsible for the overall health of all campaigns, managing sales team/client expectations, coordination with the technical operations teams for required integrations between advertisers/clients and meeting margin targets
Desired Candidate Profile
What you need to have:
- Graduation/post-graduation in any discipline from a reputed institution
- 3+ years’ experience in the digital advertising domain, with CPS experience
- Record of working with cross functional teams and external partners
- Ability to multitask in a high pressure environment
- Basic finance understanding to manage the overall revenue and cost reconciliation
- A mindset of troubleshooting/ problem solving as required
- Maniacal attention to detail and very strict adherence to timelines

Job Description for Senior Python Developer
Experience: 4 to 7 years
- Must have strong knowledge in Python 2/Python 3
- Must have strong knowledge of web development using the Django web framework and API development experience using Django REST Framework.
- Knowledge of Flask or any other framework is a plus.
- Ability to write reusable, testable, and efficient code
- Team Handling exposure
- Knowledge of standard databases like MySQL, PostgreSQL and NoSQL databases like MongoDB, Firebase, etc. Able to create database schemas that represent and support business processes.
- Understanding of fundamental design principles behind a scalable application
- Must be good at understanding client requirements.
- Good testing/debugging skills using standard python modules/IDEs
- Deployment experience with standard platforms like Apache/AWS etc.
- Experience in testing frameworks known in the industry.
- Good/basic knowledge of front-end technologies (HTML, CSS, Bootstrap, JavaScript) is a plus.
- Proficient understanding of code versioning tools - GIT
The candidate must have both hands-on coding experience and team-handling experience.
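The requirement above about designing schemas that represent a business process can be illustrated with Python's stdlib sqlite3 module. A minimal sketch, assuming a toy customers-and-orders process; all table and column names are illustrative only.

```python
# Sketch of a schema representing a simple business process (orders belong
# to customers), using the stdlib sqlite3 module with an in-memory database.
# Table/column names are illustrative, not a prescribed design.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (
        id    INTEGER PRIMARY KEY,
        name  TEXT NOT NULL
    );
    CREATE TABLE orders (
        id          INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(id),
        total       REAL NOT NULL CHECK (total >= 0)
    );
""")
conn.execute("INSERT INTO customers (id, name) VALUES (1, 'Acme')")
conn.execute("INSERT INTO orders (customer_id, total) VALUES (1, 99.5)")

# Aggregate query showing the schema supports the business question
# "how much has each customer ordered?"
row = conn.execute("""
    SELECT c.name, SUM(o.total)
    FROM customers c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
""").fetchone()
print(row)  # ('Acme', 99.5)
```

The foreign-key and CHECK constraints are the part that makes the schema "support" the process rather than merely store rows: invalid states (an order with no customer, a negative total) are rejected at the database layer.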







