11+ Outside sales Jobs in Hyderabad | Outside sales Job openings in Hyderabad
Apply to 11+ Outside sales Jobs in Hyderabad on CutShort.io. Explore the latest Outside sales Job opportunities across top companies like Google, Amazon & Adobe.
- Demonstrate, present and promote services to salons, spas & gyms (Beauty & Wellness Industry)
- Should have excellent verbal & presentation skills
- Knowledge of the local language is a must
- Should be innovative and target-driven
- Willing to travel
- Demonstrate honesty & integrity
Job Title: Data Engineer
Location: Hyderabad
About us:
Blurgs AI is a deep-tech startup focused on maritime and defence data-intelligence solutions, specialising in multi-modal sensor fusion and data correlation. Our flagship product, Trident, provides advanced domain awareness for maritime, defence, and commercial sectors by integrating data from various sensors like AIS, Radar, SAR, and EO/IR.
At Blurgs AI, we foster a collaborative, innovative, and growth-driven culture. Our team is passionate about solving real-world challenges, and we prioritise an open, inclusive work environment where creativity and problem-solving thrive. We encourage new hires to bring their ideas to the table, offering opportunities for personal growth, skill development, and the chance to work on cutting-edge technology that impacts global defence and maritime operations.
Join us to be part of a team that's shaping the future of technology in a fast-paced, dynamic industry.
Job Summary:
We are looking for a Senior Data Engineer to design, build, and maintain a robust, scalable on-premise data infrastructure. You will focus on real-time and batch data processing using platforms such as Apache Pulsar and Apache Flink, work with NoSQL databases like MongoDB and ClickHouse, and deploy services using containerization technologies like Docker and Kubernetes. This role is ideal for engineers with strong systems knowledge, deep backend data experience, and a passion for building efficient, low-latency data pipelines in a non-cloud, on-prem environment.
Key Responsibilities:
- Data Pipeline & Streaming Development
- Design and implement real-time data pipelines using Apache Pulsar and Apache Flink to support mission-critical systems.
- Develop high-throughput, low-latency data ingestion and processing workflows across streaming and batch workloads.
- Integrate internal systems and external data sources into a unified on-prem data platform.
- Data Storage & Modelling
- Design efficient data models for MongoDB, ClickHouse, and other on-prem databases to support analytical and operational workloads.
- Optimise storage formats, indexing strategies, and partitioning schemes for performance and scalability.
- Infrastructure & Containerization
- Deploy, manage, and monitor containerised data services using Docker and Kubernetes in on-prem environments.
- Performance, Monitoring & Reliability
- Monitor the performance of streaming jobs and database queries; fine-tune for efficiency and reliability.
- Implement robust logging, metrics, and alerting solutions to ensure data system availability and uptime.
- Identify bottlenecks in the pipeline and proactively implement optimisations.
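To illustrate the kind of keyed, windowed aggregation the streaming responsibilities above describe, here is a minimal sketch in plain Python. It mimics a tumbling-window count of the sort Apache Flink performs on a keyed stream; the event fields (`sensor`, `ts`, `value`) are hypothetical stand-ins, not Trident's actual schema or the Flink API.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs):
    """Group events into fixed (tumbling) windows per sensor and count them.

    Illustrative only: a real pipeline would do this with Flink's keyed
    windows over a Pulsar source, not an in-memory list.
    """
    counts = defaultdict(int)
    for e in events:
        # Align each event's timestamp to the start of its window.
        window_start = (e["ts"] // window_secs) * window_secs
        counts[(e["sensor"], window_start)] += 1
    return dict(counts)

events = [
    {"sensor": "AIS", "ts": 0, "value": 1.0},
    {"sensor": "AIS", "ts": 5, "value": 2.0},
    {"sensor": "Radar", "ts": 12, "value": 0.5},
]
print(tumbling_window_counts(events, 10))
# {('AIS', 0): 2, ('Radar', 10): 1}
```

The same windowing logic generalizes to sums or averages by accumulating values instead of counts.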
Required Skills & Experience:
- Strong experience in data engineering with a focus on on-premise infrastructure.
- Strong expertise in streaming technologies like Apache Pulsar, Apache Flink, or similar.
- Deep experience with MongoDB, ClickHouse, and other NoSQL or columnar storage databases.
- Proficient in Python, Java, or Scala for data processing and backend development.
- Hands-on experience deploying and managing systems using Docker and Kubernetes.
- Familiarity with Linux-based systems, system tuning, and resource monitoring.
Preferred Qualifications:
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field, or an equivalent combination of education and experience.
Additional Responsibilities for Senior Data Engineers:
For those hired as Senior Data Engineers, the role will come with added responsibilities, including:
- Leadership & Mentorship: Guide and mentor junior engineers, sharing expertise and best practices.
- System Architecture: Lead the design and optimization of complex real-time and batch data pipelines, ensuring scalability and performance.
- Sensor Data Expertise: Focus on building and optimizing sensor-based data pipelines and stateful stream processing for mission-critical applications in domains like maritime and defense.
- End-to-End Ownership: Take responsibility for the performance, reliability, and optimization of data systems.
Compensation:
- Data Engineer CTC: 4 - 8 LPA
- Senior Data Engineer CTC: 12 - 16 LPA
Backend Architect:
Technology: Node.js, DynamoDB / MongoDB
Roles:
- Design and implement backend services.
- Redesign existing architecture where required.
- Design and implement applications using MVC and microservice patterns.
- 9+ years of experience developing service-based applications using Node.js.
- Expert-level skills in developing web applications using JavaScript, CSS and HTML5.
- Experience working on teams that practice BDD (Behavior-Driven Development).
- Understanding of micro-service architecture and RESTful API integration patterns.
- Experience using Node.js for automation and leveraging NPM for package management
- Solid Object Oriented design experience, and creating and leveraging design patterns.
- Experience working in a DevOps/Continuous Delivery environment and associated toolsets (i.e. Jenkins, Puppet etc.)
Desired/Preferred Qualifications :
- Bachelor's degree or equivalent experience
- Strong problem solving and conceptual thinking abilities
- Desire to work in a collaborative, fast-paced, start-up like environment
- Experience leveraging node.js frameworks such as Express.
- Experience with distributed source control management, i.e. Git
Primary job role
Develops software and information systems by designing and creating new functionality and installing software solutions, and plays a leadership role through a thorough understanding of design methodologies and the overall software development lifecycle.
Main duties/responsibilities
- Be involved in all stages of the software development process, including requirement gathering, design, development, testing, deployment management, issue review, and maintenance.
- Good analytical and problem-solving skills.
- Learn and maintain up to date knowledge of latest technologies, tools and platforms.
- Responsible for maintaining a high level of expertise in all areas of technology used (or potentially used) by the team.
- Produce high-quality code and designs by following industry best practices and coding standards.
- Communicate with clients as and when required.
- Responsible for clear communication (both written and verbal) with technical and non-technical contacts (internal and external) and all seniority levels.
- Ability to work both independently and as a part of a team.
- Work with non-technical business teams to understand the functionality, composition, and user requirements.
- Ability to achieve tasks with minimum supervision.
- Participate in estimations.
- Participate in design discussions.
- Provides technical support to customers and employees.
- Provides training and guidance to the new developers.
- Ability to play a leadership role within the team
Experience
- 4-5 years of experience in the relevant field, with 1-2 years in an SSE role.
Technical Knowledge
- Angular, React/Redux (or Similar Technology)
- Node.js
- RESTful API integration
- Experience working with Docker and containerized applications
- Experience working with Kubernetes or other container orchestration tools
- Knowledge and understanding of working in AWS environments
- Experience in (and passion for) full-stack development
- RDBMS and Document Database
- Experience with Agile/Scrum Development Methodology
- Exposure to Continuous Integration (CI/CD)
- Version control systems like Git/ TFS
Qualifications
- Bachelor’s Degree in Computer Science / IT or an equivalent qualification.
Behavioral competencies
- Communication
- Teamwork & collaboration
- Client orientation
Leads are expected to know Angular 9, React (functional components, Hooks), Node.js (NestJS), Test-Driven Development (unit tests with Jest), and microservices/containers (Docker).
Job Description:
As a Technical Writer, you will be responsible for working independently on components of an information development project. You will contribute to the team on a variety of projects, calling upon team members for guidance as needed.
Experience Range:
5 - 7 years
Job Responsibilities:
- Strong writing skills, namely a command of grammar, syntax, diction, and the conventions and best practices of writing a variety of technical documents.
- Experience in using the Darwin Information Typing Architecture (DITA).
- Good knowledge and writing experience using XML-based authoring tools such as oXygen XML Editor.
- Create and maintain Online Help, Installation Guide, Upgrade Guide, Deployment Guide, Release Notes, Support Matrix, and other such product documentation deliverables.
- Design, develop, and write technically accurate and comprehensive product documentation that adheres to the MSTP.
- Understanding of core information development processes: content planning, content creation, and content review.
- Fundamental skills with the authoring tools used by the information development teams.
- Fundamental collaboration skills: Ability to work with cross-functional teams and other writers for updates in the product or the process.
- Ensure strict adherence to the delivery schedule by planning, tracking, and delivering as per sprint commitments.
- Participate in daily scrum meetings, development design reviews, and documentation reviews.
Skills Required:
DITA XML, XMetaL, Oxygen XML Editor, MSTP, Astoria, Visual Studio, Agile environment, Software Development Life Cycle (SDLC)
Desired Characteristics:
- Fundamental knowledge of core technical communication concepts, such as topic-based authoring, minimalism, task-oriented design, single-sourcing.
- Experience with content management systems, such as Astoria and Visual Studio.
- Basic skills in editing (QA for documentation) – ability to recognize errors in a variety of information deliverables.
- Working knowledge of Lean/Agile/XP software development processes and how to follow processes for information development.
- Working knowledge of major aspects of software design, development, and QA and how they affect the information development process.
NNIIT is a pioneering Edtech start-up with the goal of empowering individuals in the digital age. We are actively seeking talented individuals to join our dynamic team in Hyderabad. If you meet the qualifications and are passionate about making an impact in the Ed-tech sector, keep reading!
Job Description:
Company: NNIIT (On-Roll)
Salary: 4 LPA Plus Incentives
Job Location: Hyderabad (On site)
Roles & Responsibilities:
- Source new sales opportunities through inbound lead follow-up and outbound cold calls and emails.
- Make 250+ calls per day and maintain 3 hours of talk time, booking 10 sessions per week.
- Understand customer needs and requirements. Route qualified opportunities to the appropriate sales executives for further development and closure.
- Close sales and achieve quarterly quotas
- Research accounts, identify key players and generate interest.
- Maintain and expand your database of prospects within your assigned territory.
- Team with channel partners to build pipeline and close deals.
- Perform effective online demos to prospects.
FAQs:
Salary Expectation: Minimum 20,000 INR, with the potential to reach up to 74,000 INR based on skills, experience, and interview performance.
Eligibility Criteria: Graduates with 1 to 3 years of experience in an Ed-tech K9 & K12 background are eligible to apply.
Specific Skills Required: Good sales skills, proficiency in English, and strong communication abilities are essential.
Applicants: Both male and female candidates are welcome to apply.
Responsibilities -
- Collaborate with the development team to understand data requirements and identify potential scalability issues.
- Design, develop, and implement scalable data pipelines and ETL processes to ingest, process, and analyse large volumes of data from various sources.
- Optimize data models and database schemas to improve query performance and reduce latency.
- Monitor and troubleshoot the performance of our Cassandra database on Azure Cosmos DB, identifying bottlenecks and implementing optimizations as needed.
- Work with cross-functional teams to ensure data quality, integrity, and security.
- Stay up to date with emerging technologies and best practices in data engineering and distributed systems.
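The ingest-transform-load shape described in these responsibilities can be sketched in plain Python. This is a minimal illustration only: an in-memory dict stands in for the Cassandra/Cosmos DB sink, and the field names (`id`, `region`, `amount`) are hypothetical, not any real schema.

```python
def extract(rows):
    """Ingest raw records; a list of dicts stands in for a real source."""
    return [r for r in rows if r.get("id") is not None]  # drop malformed rows

def transform(rows):
    """Normalize values and derive a partition-key candidate, as one might
    when modelling data for Cassandra."""
    out = []
    for r in rows:
        out.append({
            "id": r["id"],
            "region": r.get("region", "unknown").lower(),  # partition-key candidate
            "amount": round(float(r.get("amount", 0)), 2),
        })
    return out

def load(rows, table):
    """Write into an in-memory dict keyed by (region, id), loosely mirroring
    a (partition key, clustering key) layout."""
    for r in rows:
        table[(r["region"], r["id"])] = r
    return table

table = {}
raw = [{"id": 1, "region": "EU", "amount": "10.456"}, {"id": None}]
load(transform(extract(raw)), table)
print(sorted(table))  # [('eu', 1)]
```

In a real pipeline each stage would be a separate, monitorable step writing through a Cassandra driver rather than a dict.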
Qualifications & Requirements -
- Proven experience as a Data Engineer or similar role, with a focus on designing and optimizing large-scale data systems.
- Strong proficiency in working with NoSQL databases, particularly Cassandra.
- Experience with cloud-based data platforms, preferably Azure Cosmos DB.
- Solid understanding of Distributed Systems, Data modelling, Data Warehouse Designing, and ETL Processes.
- Detailed understanding of Software Development Life Cycle (SDLC) is required.
- Good to have: knowledge of a visualization tool such as Power BI or Tableau.
- Good to have: knowledge of the SAP landscape (SAP ECC, SLT, BW, HANA, etc.).
- Good to have: experience on a data migration project.
- Knowledge of Supply Chain domain would be a plus.
- Familiarity with software architecture (data structures, data schemas, etc.)
- Familiarity with Python programming language is a plus.
- The ability to work in a dynamic, fast-paced, work environment.
- A passion for data and information with strong analytical, problem solving, and organizational skills.
- Self-motivated with the ability to work under minimal direction.
- Strong communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.
Position:
Oracle Cloud Technical
Location:
Hyderabad or Bangalore
Overview:
Recognized on the Inc. 5000 fastest growing companies in the US, Peloton is one of the largest and fastest growing professional services firms specializing in Integrated Cloud Solutions for Enterprise Resource Planning, Enterprise Performance Management, Supply Chain Management, Human Capital Management and Big Data and Analytics. Peloton has the vision and connected capabilities to help CFOs, CIOs and business leaders to envision, implement and realize the benefits of digital transformation. Companies that are equipped with the right information, have the know-how, and the enabling technology to consistently leverage analytics will gain a competitive advantage. Our people are recognized as some of the best minds and most committed people in the industry. We believe in quality. We appreciate creativity. We recognize individual contributions, and we place trust in our team members. And…we love what we do.
Peloton provides Advisory, Consulting, and Managed services with deep functional and technical expertise specializing in serving clients in the Life Sciences, Retail, Manufacturing, Insurance, Aerospace and Defense and Financial Services industries. Our business and technology professionals provide a unique perspective, proven experience, with an innovative and collaborative approach to achieve results for clients.
If you are interested in being part of our high performing and growing organization – and have strong business and technical expertise; especially as related to Oracle Cloud Applications, Integrations, and Reporting experience, you may be a good fit for our team. Peloton has a unique opportunity for an experienced Oracle Applications Technical Manager to play a hands on role with an ability to lead and manage a growing team of technical consultants.
Responsibilities
Responsibilities will vary depending on the level and experience of the individual. The Oracle Cloud Technical - Manager will provide oversight and work as a hands-on manager supporting a team of developers. The Manager will work as part of a project team to deliver analytical, solution-oriented services to Fortune 1000 clients. Based upon experience, specific responsibilities may include:
- Act as Oracle Technical Lead on projects: leading the architecture, creating technical designs, and leading a team of Cloud technical consultants (Integrations, Reporting & PaaS).
- Provide technical solutions in Oracle Cloud modules and custom development, for the issues raised by the business.
- Understand the Data conversion / migration strategy, gather requirements and propose technical approach.
- Defining new and refining existing solutions using industry best practices for enterprise data management and data integration
- Continuously engage with project managers, Onsite teams to obtain information necessary for a robust solution and smooth delivery.
- Contributing to continuous improvement and development of Peloton processes and intellectual property
- Provide guidance and advice to technical developers to help them improve their technical and professional skills
- Review technical solutions and code to ensure adherence to best practices and to help team achieve industry leading quality and performance of objects assigned to our offshore development center
Required Experience & Skills
- Qualified candidates should have a B.Tech or M.C.A degree from a recognized university.
- 6 to 13 years of experience in Oracle Applications, including 3-4 years of experience in Oracle Cloud technical work (Reporting, Integrations, and PaaS development).
- Working knowledge of the data models and functionality of Oracle Financials, SCM, and Manufacturing modules.
- At least 5 years of hands-on integration experience, with a deep understanding of SOAP/REST APIs and multiple integration platforms.
- At least 5 years of SQL experience and a strong understanding of interface mechanisms in and out of Oracle Cloud, including Web Services and File-Based Data Import and Export (FBDI).
- Knowledge of open interfaces: AR Invoices, GL Journals, Projects, Requisitions, POs, Customers, and AP Invoices; SCM modules are also required.
- At least 2 years of experience leading a team of 10+ resources, with the ability to mentor and train others.
- Experience and expertise in writing technical specifications in a Fusion Cloud/Middleware environment.
- Experience working directly with customers, including technical architects, functional consultants, and developers.
- Fit with Peloton culture and company values: teamwork, innovation, integrity, service, a "can-do" attitude, and speaking your ideas.
- Enthusiastic, energetic, and highly driven, with the desire to learn our business.
- Excellent analytical and problem-solving skills.
- Strong consultancy skills, including consulting experience, with excellent written and verbal communication skills.
- Proven ability to work remotely and independently in support of clients and across multiple initiatives.
Additional Desired Skills
- Certifications in Oracle Cloud Applications integrations
- Experience leading solution workshops and mentoring junior staff.
Compensation:
- Competitive base salary with performance based bonus
- Vacation, Leave, and Holiday pay
- Group Medical Insurance
- Group Term Accident & Life Insurance
Peloton Group is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.
The Full Stack Developer at Roarke will be responsible for building scalable software applications and is comfortable with both front-end and back-end coding languages, development frameworks, and third-party libraries. He/she is responsible for developing and designing front-end web architecture, ensuring the responsiveness of applications, and working alongside graphic designers on application design features, among other duties.
Responsibilities:
- Interface with technical team to determine reporting requirements of a given project
- Developing front-end website architecture.
- Designing user interactions on web pages.
- Developing back-end website applications.
- Creating servers and databases for functionality.
- Ensuring cross-platform optimization for mobile phones.
- Ensuring responsiveness of applications.
- Working alongside graphic designers for web design features.
- Seeing through a project from conception to finished product.
- Designing and developing APIs.
- Meeting both technical and consumer needs.
- Staying abreast of developments in web applications and programming languages.
Required Skills:
- Knowledge of essential front-end technologies like HTML, CSS, JavaScript
- At least one server-side programming language like Java, Python, PHP, Ruby, etc.
- Database management and caching mechanism
- Server and configuration management
- Version control platforms like GitHub, GitLab, Beanstalk
- Basic UI/UX design
- Project management skills
- Security awareness
- Familiarity with the agile development approach for coordinating multidisciplinary tasks.
- Generate an MVP (minimum viable product)
- Problem-solving skills: you will be expected to be the go-to person for any technical difficulty while building an application.
Desired Skills:
- Good communication and problem-solving skills.
- Project management skills.
- Focus on customer satisfaction
Required Experience:
- More than 4 years' experience developing mobile and web applications
Required Education:
- A Bachelor’s degree in Computer Science or a related field
Summary
Our Kafka developer has a combination of technical skills, communication skills, and business knowledge, and should be able to work on multiple medium-to-large projects. The successful candidate will have excellent technical skills in Apache/Confluent Kafka and enterprise data warehousing (preferably GCP BigQuery or an equivalent cloud EDW), and will be able to take oral and written business requirements and develop efficient code to meet set deliverables.
Must Have Skills
- Participate in the development, enhancement and maintenance of data applications both as an individual contributor and as a lead.
- Leading in the identification, isolation, resolution and communication of problems within the production environment.
- Act as lead developer, applying technical skills in Apache/Confluent Kafka (preferred) or AWS Kinesis (optional), and in a cloud enterprise data warehouse: Google BigQuery (preferred), AWS Redshift, or Snowflake (optional).
- Design and recommend the best approach for moving data from different sources to the cloud EDW using Apache/Confluent Kafka.
- Performs independent functional and technical analysis for major projects supporting several corporate initiatives.
- Communicate and work with IT partners and the user community at all levels, from senior management to developers to business SMEs, for project definition.
- Works on multiple platforms and multiple projects concurrently.
- Performs code and unit testing for complex scope modules, and projects
- Provide expertise and hands on experience working on Kafka connect using schema registry in a very high volume environment (~900 Million messages)
- Provide expertise in Kafka brokers, ZooKeeper, KSQL, KStreams, and Kafka Control Center.
- Provide expertise and hands on experience working on AvroConverters, JsonConverters, and StringConverters.
- Provide expertise and hands-on experience working with Kafka connectors such as MQ, Elasticsearch, JDBC, FileStream, and JMS source connectors, along with tasks, workers, converters, and transforms.
- Provide expertise and hands on experience on custom connectors using the Kafka core concepts and API.
- Working knowledge on Kafka Rest proxy.
- Ensure optimum performance, high availability and stability of solutions.
- Create topics, setup redundancy cluster, deploy monitoring tools, alerts and has good knowledge of best practices.
- Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms.
- Leverage Hadoop ecosystem knowledge to design and develop capabilities to deliver our solutions using Spark, Scala, Python, Hive, Kafka, and other tools in the Hadoop ecosystem.
- Use automation and provisioning tools such as Jenkins, UDeploy, or similar technologies.
- Ability to perform data related benchmarking, performance analysis and tuning.
- Strong skills in In-memory applications, Database Design, Data Integration.
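The converters mentioned above (AvroConverter, JsonConverter, StringConverter) all do the same job: serialize a Connect record to bytes before it reaches the broker. A conceptual sketch in plain Python, not the real Kafka Connect API; the `convert` dispatcher is a hypothetical stand-in for Connect's `key.converter`/`value.converter` configuration.

```python
import json

def json_converter(record):
    """Serialize a record as JSON bytes, like Connect's JsonConverter
    (schemas disabled)."""
    return json.dumps(record, sort_keys=True).encode("utf-8")

def string_converter(record):
    """Serialize a record as its string form, like StringConverter."""
    return str(record).encode("utf-8")

def convert(record, converter="json"):
    """Dispatch to a converter by name, mimicking how Connect picks the
    converter named in its worker/connector configuration."""
    converters = {"json": json_converter, "string": string_converter}
    return converters[converter](record)

msg = {"id": 42, "status": "ok"}
print(convert(msg, "json"))        # b'{"id": 42, "status": "ok"}'
print(convert("plain", "string"))  # b'plain'
```

A real AvroConverter would additionally register and validate schemas against the Schema Registry, which is what makes it suited to the high-volume environments described above.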


