11+ Outside sales Jobs in Hyderabad | Outside sales Job openings in Hyderabad

- Demonstrate, present and promote services to salons, spas & gyms (Beauty & Wellness Industry)
- Should have excellent verbal & presentation skills
- Knowledge of the local language is a must
- Should be innovative and target-driven
- Willing to travel
- Demonstrate honesty & integrity
Salesforce QA Lead
Exp: 10+ years
Location: Hyderabad
Work mode: 5 Days work from office
Employment type: Contract
Job Description:
· 10+ years of experience leading manual testing efforts, developing test plans, and managing defect lifecycle to ensure high-quality software delivery.
· Experience in Salesforce Automation testing.
· Create, review, and execute detailed test plans, test cases, and test scenarios based on functional and business requirements.
· Strong knowledge of QA testing methodologies that cover all aspects of software testing, including functional testing, UI/UX testing and usability testing.
· Collaborate with multiple stakeholders to ensure thorough understanding of requirements.
Requirements:
- Excellent knowledge of Core Java (J2SE) and J2EE technologies.
- Hands-on experience with RESTful services and API design is a must.
- Knowledge of microservices architecture is a must.
- Knowledge of design patterns is a must.
- Strong knowledge of exception handling and logging mechanisms is a must.
- Agile Scrum participation experience; work experience with several agile teams on applications built with microservices and event-based architectures, deployed in hybrid (on-prem/cloud) environments.
- Good knowledge of the Spring framework (MVC, Cloud, Data, Security, etc.) and an ORM framework like JPA/Hibernate.
- Experience in managing the Source Code Base through Version Control tools like SVN, GitHub, Bitbucket, etc.
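The exception-handling and logging requirement above is easiest to show in code; here is a minimal sketch in Python rather than the JD's Java stack (the logger name and function are illustrative, but the pattern — log with context, then re-raise — carries over directly):

```python
import logging

logger = logging.getLogger("orders")  # illustrative logger name

def parse_quantity(raw):
    """Parse a quantity string, logging and re-raising on bad input."""
    try:
        qty = int(raw)
        if qty <= 0:
            raise ValueError("quantity must be positive")
        return qty
    except ValueError:
        # logger.exception records the stack trace along with the message
        logger.exception("invalid quantity: %r", raw)
        raise
```

The caller still sees the original exception, while the log captures what failed and why.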

Data Engineer
Mandatory Requirements
- Experience in AWS Glue
- Experience in Apache Parquet
- Proficient in AWS S3 and data lake
- Knowledge of Snowflake
- Understanding of file-based ingestion best practices.
- Scripting languages: Python & PySpark
CORE RESPONSIBILITIES
- Create and manage cloud resources in AWS
- Ingest data from different sources that expose it through various technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from proprietary systems; implement data ingestion and processing with the help of Big Data technologies
- Process/transform data using various technologies such as Spark and cloud services; you will need to understand your part of the business logic and implement it using the language supported by the base data platform
- Develop automated data quality checks to make sure the right data enters the platform and to verify the results of the calculations
- Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
- Define process improvement opportunities to optimize data collection, insights and displays.
- Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible
- Identify and interpret trends and patterns from complex data sets
- Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders.
- Key participant in regular Scrum ceremonies with the agile teams
- Proficient at developing queries, writing reports and presenting findings
- Mentor junior members and bring best industry practices
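The automated data quality checks mentioned in the responsibilities above typically split incoming records into accepted and rejected sets, with reasons attached. A minimal sketch in Python (the field names and rules are hypothetical, not tied to any specific platform):

```python
# Data-quality check sketch: validate records before they enter the
# platform. Field names and validation rules are hypothetical.

def check_record(record):
    """Return a list of human-readable problems found in one record."""
    problems = []
    if not record.get("customer_id"):
        problems.append("missing customer_id")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        problems.append("amount must be a non-negative number")
    return problems

def run_quality_checks(records):
    """Split records into accepted and rejected, keeping the reasons."""
    accepted, rejected = [], []
    for rec in records:
        problems = check_record(rec)
        (rejected if problems else accepted).append((rec, problems))
    return accepted, rejected
```

In a real pipeline the rejected set would be routed to a quarantine location and surfaced in monitoring rather than silently dropped.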
QUALIFICATIONS
- 5-7+ years' experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
- Strong background in math, statistics, computer science, data science or related discipline
- Advanced knowledge of one of the following languages: Java, Scala, Python, C#
- Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake
- Proficient with:
  - Data mining/programming tools (e.g. SAS, SQL, R, Python)
  - Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)
  - Data visualization tools (e.g. Tableau, Looker, MicroStrategy)
- Comfortable learning about and deploying new technologies and tools.
- Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines.
- Good written and oral communication skills and ability to present results to non-technical audiences
- Knowledge of business intelligence and analytical tools, technologies and techniques.
Familiarity and experience in the following is a plus:
- AWS certification
- Spark Streaming
- Kafka Streaming / Kafka Connect
- ELK Stack
- Cassandra / MongoDB
- CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools
Informatica PowerCenter, Informatica Change Data Capture, Azure SQL, Azure Data Lake
Job Description:
- Minimum of 15 years of experience with Informatica ETL and database technologies.
- Experience with Azure database technologies, including Azure SQL Server and Azure Data Lake.
- Exposure to change data capture technology.
- Lead and guide development of an Informatica-based ETL architecture.
- Develop solutions in a highly demanding environment and provide hands-on guidance to other team members.
- Lead complex ETL requirements and design.
- Implement an Informatica-based ETL solution fulfilling stringent performance requirements.
- Collaborate with product development teams and senior designers to develop architectural requirements.
- Assess requirements for completeness and accuracy, and determine whether they are actionable for the ETL team.
- Conduct impact assessments and determine the size of effort based on requirements.
- Develop full SDLC project plans to implement the ETL solution and identify resource requirements.
- Play an active, leading role in shaping and enhancing the overall Informatica ETL architecture; identify, recommend, and implement ETL process and architecture improvements.
- Assist with and verify the design of the solution and the production of all design-phase deliverables.
- Manage the build phase and quality-assure code to ensure it fulfills requirements and adheres to the ETL architecture.



Requirements:
• Minimum 3 years of experience in React Native.
• Integration of Rest APIs in react native.
• Firm grasp of the JavaScript and TypeScript languages and their nuances, including ES6+ syntax
• Knowledge of functional and object-oriented programming
• Ability to write well-documented, clean JavaScript code
• Rock solid at working with third-party dependencies and debugging dependency conflicts
• Familiarity with native build tools, like XCode, Gradle, Android studio and IntelliJ
• Proficient understanding of releasing and monitoring apps on the iOS App Store and Play Store.
• Should be able to communicate to backend developers on API building and modifications needed.
• Experience testing and releasing builds; experience with application-testing software such as Appium is preferable.
• Understanding of REST APIs, the document request model, and offline storage
• Experience with automated testing suites, like Jest or Mocha
• Experience with handling production mobile apps.
• Experience with performance testing and optimization.
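The offline-storage requirement above comes down to caching API responses locally and serving the last known copy when the network is unavailable. A language-agnostic sketch in Python (class and method names are illustrative; a React Native app would persist with something like AsyncStorage rather than an in-memory dict):

```python
# Offline-cache sketch: serve the last known API response when a
# fetch fails. Names are illustrative, not a real React Native API.

class OfflineCache:
    def __init__(self, fetch_fn):
        self._fetch = fetch_fn  # callable that may raise on network error
        self._store = {}        # url -> last successful response

    def get(self, url):
        try:
            response = self._fetch(url)
            self._store[url] = response   # refresh the cache on success
            return response
        except ConnectionError:
            if url in self._store:
                return self._store[url]   # fall back to the cached copy
            raise
```

The same read-through pattern underlies most offline-first mobile designs: prefer fresh data, degrade to stale data, and only fail when neither exists.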
Responsibilities:
• We are looking for a React Native developer interested in building performant mobile apps on both the iOS and Android platforms.
• You will be responsible for architecting and building these applications, as well as coordinating with the teams responsible for other layers of the product infrastructure.
• Building a product is a highly collaborative effort, and as such, a strong team player with a commitment to perfection is required.
• Build pixel-perfect, buttery-smooth UIs across both mobile platforms.
• Leverage native APIs for deep integrations with both platforms.
• Diagnose and fix bugs and performance bottlenecks for performance that feels native.
• Reach out to the open-source community to encourage and help implement mission-critical software fixes—React Native moves fast and often breaks things.
• Maintain code and write automated tests to ensure the product is of the highest quality.
• Transition existing React web apps to React Native.
This is regarding a permanent opening with Data Semantics.
Data Semantics
We are a product-based company and a Microsoft Gold Partner.
Data Semantics is an award-winning Data Science company with a vision to empower every organization to harness the full potential of its data assets. To achieve this, we provide Artificial Intelligence, Big Data, and Data Warehousing solutions to enterprises across the globe. Data Semantics was listed as one of the top 20 Analytics companies by Silicon India 2018 and as one of the top 20 BI companies by CIO Review India 2014. We are headquartered in Bangalore, India, with offices in 6 global locations including the USA, United Kingdom, Canada, United Arab Emirates (Dubai, Abu Dhabi), and Mumbai. Our mission is to enable our people to learn the art of data management and visualization to help our customers make quick and smart decisions.
Our Services include:
Business Intelligence & Visualization
App and Data Modernization
Low Code Application Development
Artificial Intelligence
Internet of Things
Data Warehouse Modernization
Robotic Process Automation
Advanced Analytics
Our Products:
Sirius – World’s most agile conversational AI platform
Serina
Conversational Analytics
Contactless Attendance Management System
Company URL: https://datasemantics.co
JD:
- MSBI
- SSAS
- SSRS
- SSIS
- Data Warehousing
- SQL



We are looking for a developer experienced in Ruby and the Rails framework, along with the PostgreSQL database; someone who is passionate about coding and loves to work in an ongoing, challenging environment. You will be part of a talented software team. You will have to consistently deliver in a fast-paced environment and should be more than willing to build software that people love to use.
Key Responsibilities
The individual role that you’ll play in our team:
● Developing large multi-tenant applications in Rails.
● Understanding Rails best practices and religiously introducing those to our
codebase.
● Knowledge on how to do effective Refactoring.
● Ability to write unit tests and following those practices religiously.
● Working closely with the Product managers and UX team.
● Helping QAs to write automated integration tests.
● Staying up-to-date with current and future Backend technologies and
architectures.
Read the ‘Skills and Experience’ section, it is not the usual yada yada, you’ll be
asked specific questions on these.
Skills and Experience
● Ruby on Rails architecture best practices
● Knowledge of the latest versions of RoR
● Strong OOP knowledge in Ruby.
● Asynchronous Networking in Ruby
● Designing RESTful HTTP APIs using JSON-Schema or JSON API (jsonapi.org).
● Ability to architect and develop API only backend
● Experience in using ActiveRecordSerializer
● Understanding OAuth2 or JWT (JSON Web Token) authentication mechanisms.
● How to use RSpec
● Rails Security Best Practices
● PostgreSQL and Rails.
● SQL concepts like Joins, Relationships etc.
● Understanding DB Partition strategies.
● Knowledge about refactoring ActiveRecord Models (read this - “7 Patterns to
Refactor Fat ActiveRecord Models”).
● Understanding scaling strategies for high-traffic Rails applications (2 million+ requests a day).
● Background Job processing using Redis and Sidekiq
● Experience in using Amazon Web Services (AWS) tools.
● Writing automated Deployment Scripts using Capistrano, Ansible etc.
● Sending emails in Rails
● Knowledge in Linux and Git is mandatory
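The OAuth2/JWT item in the list above refers to tokens whose header and payload are base64url-encoded JSON segments joined by dots. A minimal Python sketch that decodes a JWT payload to show the structure — for illustration only, since real code must verify the signature with a library such as ruby-jwt or PyJWT:

```python
import base64
import json

def decode_jwt_payload(token):
    """Decode a JWT's payload WITHOUT verifying its signature.
    Illustration only: production code must verify the signature."""
    header_b64, payload_b64, _signature = token.split(".")
    # base64url decoding requires padding out to a multiple of 4
    padded = payload_b64 + "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))
```

Skipping signature verification is exactly the mistake the "Rails Security Best Practices" item warns against; an unverified payload can be forged by anyone.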
- B.E/B.Tech or any relevant Master's degree from a reputed college
- 2 to 4 years of experience with Java / J2EE application development stack
- Experience with: Spring framework (Core, JPA, Security, Boot), Hibernate, securing APIs using JWT/OAuth
- Good understanding of OOP concepts
- Proficient understanding of code quality standards
- Solid experience in design, coding, unit testing and debugging
Good to have (optional):
- Working experience with microservices, Sonar, Docker, K8s, Jenkins
- Knowledge of AWS cloud




