
Job description
- Important note: strong communication skills are required. Only 2023 and 2024 graduates looking for an internship leading to a permanent job may apply.
- Must be interested in pursuing a career in sales.
About Company:
All Time Design is seeking enthusiastic and motivated individuals to join our team as Business Development Executives. We are a dynamic company focused on providing innovative solutions to our clients. As a fresher, you will have the opportunity to learn and grow in a supportive environment while contributing to our business development efforts.
Responsibilities:
- Learn to qualify leads from marketing campaigns as potential sales opportunities
- Assist in contacting potential prospects through various channels, such as cold calls and emails
- Learn to present our company and its offerings to potential prospects
- Gain an understanding of prospects' needs and learn to suggest appropriate products/services
- Assist in building and nurturing long-term relationships with prospects
- Shadow experienced team members to learn how to proactively seek new business opportunities in the market
- Support in setting up meetings or calls between prospective customers and Sales Manager
- Provide assistance in reporting sales results to the Sales Manager
Requirements:
- No prior work experience required; fresh graduates are welcome to apply
- Eagerness to learn and develop sales skills
- Strong communication skills, both verbal and written
- Willingness to engage in cold calling, emailing, and other sales prospecting techniques
- Basic understanding of sales concepts and metrics is a plus
- Familiarity with CRM software is a plus
- Ability to adapt and thrive in a fast-paced environment
Join us and kickstart your career in sales with All Time Design!
Job Type: Internship
Contract length: 6 months
Pay: ₹10,000.00 per month
Benefits:
- Flexible schedule
Schedule:
- Day shift
Language:
- English (Required)
Work Location: In person

Similar jobs
GCP Cloud Engineer:
- Proficiency in infrastructure as code (Terraform).
- Scripting and automation skills (e.g., Python, Shell); Python is a must.
- Collaborate with teams across the company (i.e., network, security, operations) to build complete cloud offerings.
- Design Disaster Recovery and backup strategies to meet application objectives.
- Working knowledge of Google Cloud
- Working knowledge of various tools, open-source technologies, and cloud services
- Experience working on Linux-based infrastructure.
- Excellent problem-solving and troubleshooting skills
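The scripting-and-automation bullet above is the kind of task that usually starts small. As a hedged sketch (the log format and the ERROR-keyword convention are assumptions for illustration, not from the posting), a minimal Python automation might look like:

```python
import re

def count_errors(log_lines, pattern=r"\bERROR\b"):
    """Count log lines matching an error pattern.

    The pattern and log format here are illustrative assumptions;
    real infrastructure logs would need their own conventions.
    """
    matcher = re.compile(pattern)
    return sum(1 for line in log_lines if matcher.search(line))

sample = [
    "2024-05-01 10:00:01 INFO  service started",
    "2024-05-01 10:00:02 ERROR disk quota exceeded",
    "2024-05-01 10:00:03 WARN  retrying request",
    "2024-05-01 10:00:04 ERROR upstream timeout",
]
print(count_errors(sample))  # 2 error lines in this sample
```

In practice a script like this would read from a file or a log stream rather than an in-memory list, but the pattern-matching core is the same.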
Job Summary:
We are seeking a highly skilled OutSystems Senior Developer to design, develop, and optimize enterprise applications using the OutSystems low-code platform. The ideal candidate will have extensive experience in OutSystems development, a strong understanding of software architecture, and the ability to mentor junior developers. You will work closely with business stakeholders, UX designers, and other developers to deliver high-quality applications that meet business needs.
Key Responsibilities:
Design, develop, and implement scalable applications using the OutSystems platform.
Collaborate with business analysts and stakeholders to gather and analyze requirements.
Ensure best practices, performance optimization, and security compliance in OutSystems applications.
Integrate OutSystems applications with external systems and APIs.
Troubleshoot and resolve technical issues, ensuring system stability and performance.
Conduct code reviews and provide guidance to junior developers.
Stay updated with OutSystems best practices, tools, and emerging technologies.
Work in an Agile environment and participate in sprint planning, stand-ups, and retrospectives.
Required Skills & Qualifications:
3+ years of experience in OutSystems development.
Certification in OutSystems (e.g., OutSystems Professional or Expert Developer) is a plus.
Strong knowledge of OutSystems Architecture, Performance Tuning, and Security Best Practices.
Proficiency in SQL, JavaScript, HTML, CSS, and Web Services (REST/SOAP).
Experience in integrating OutSystems applications with third-party systems and databases.
Understanding of Agile methodologies and DevOps practices.
Excellent problem-solving skills and ability to work in a fast-paced environment.
Strong communication and teamwork skills.
Preferred Qualifications:
Experience with cloud-based deployments (AWS, Azure, OIC).
Knowledge of CI/CD pipelines and automated testing.
The Sr AWS/Azure/GCP Databricks Data Engineer at Koantek will use comprehensive
modern data engineering techniques and methods with Advanced Analytics to support
business decisions for our clients. Your goal is to support the use of data-driven insights
to help our clients achieve business outcomes and objectives. You can collect, aggregate, and analyze structured/unstructured data from multiple internal and external sources and communicate patterns, insights, and trends to decision-makers. You will help design and build data
pipelines, data streams, reporting tools, information dashboards, data service APIs, data
generators, and other end-user information portals and insight tools. You will be a critical
part of the data supply chain, ensuring that stakeholders can access and manipulate data
for routine and ad hoc analysis to drive business outcomes using Advanced Analytics. You are expected to function as a productive member of a team, working and
communicating proactively with engineering peers, technical leads, project managers, product owners, and resource managers.
Requirements:
Strong experience as an AWS/Azure/GCP Data Engineer; AWS/Azure/GCP Databricks experience is a must. Expert proficiency in Spark, Scala, and Python
Must have data migration experience from on-prem to cloud
Hands-on experience with Kinesis to process and analyze streaming data, and with Event/IoT Hubs and Cosmos DB
In-depth understanding of Azure/AWS/GCP cloud, data lake, and analytics solutions. Expert-level hands-on experience designing and developing applications on Databricks. Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services
In-depth understanding of Spark architecture, including Spark Streaming, Spark Core, Spark SQL, DataFrames, RDD caching, and Spark MLlib
Hands-on experience with the technology stack available in the industry for data management, data ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc.
Hands-on knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake
Good working knowledge of code versioning tools [such as Git, Bitbucket or SVN]
Hands-on experience using Spark SQL with various data sources such as JSON, Parquet, and key-value pairs
Experience preparing data for data science and machine learning, with exposure to model selection, model lifecycle, hyperparameter tuning, model serving, deep learning, etc.
Demonstrated experience preparing data and automating and building data pipelines for AI use cases (text, voice, image, IoT data, etc.). Programming experience with .NET or Spark/Scala is good to have
Experience in creating tables, partitioning, bucketing, and loading and aggregating data using Spark Scala and Spark SQL/PySpark
Knowledge of AWS/Azure/GCP DevOps processes like CI/CD as well as Agile tools
and processes including Git, Jenkins, Jira, and Confluence
Working experience with Visual Studio, PowerShell scripting, and ARM templates
Able to build ingestion to ADLS and enable a BI layer for analytics
Strong understanding of data modeling and defining conceptual, logical, and physical data models
Big data/analytics/information analysis/database management in the cloud
IoT/event-driven/microservices in the cloud: experience with private and public cloud architectures, their pros/cons, and migration considerations
Ability to remain up to date with industry standards and technological advancements
that will enhance data quality and reliability to advance strategic initiatives
Working knowledge of RESTful APIs, OAuth2 authorization framework and security
best practices for API Gateways
Guide customers in transforming big data projects, including development and
deployment of big data and AI applications
Guide customers on Data engineering best practices, provide proof of concept, architect solutions and collaborate when needed
2+ years of hands-on experience designing and implementing multi-tenant solutions using AWS/Azure/GCP Databricks for data governance, data pipelines for near-real-time data warehouses, and machine learning solutions
Overall, 5+ years of experience in a software development, data engineering, or data analytics field using Python, PySpark, Scala, Spark, Java, or equivalent technologies
Hands-on expertise in Apache Spark™ (Scala or Python)
3+ years of experience in query tuning, performance tuning, troubleshooting, and debugging Spark and other big data solutions
Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience
Ability to manage competing priorities in a fast-paced environment
Ability to resolve issues
Basic experience with or knowledge of agile methodologies
AWS Certified: Solutions Architect Professional
Databricks Certified Associate Developer for Apache Spark
Microsoft Certified: Azure Data Engineer Associate
Google Cloud Certified Professional
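Several of the requirements above mention RESTful APIs and the OAuth2 authorization framework. As a minimal sketch using only Python's standard library (the URL and token are hypothetical placeholders, and no network call is made), attaching a bearer token to an API request looks like:

```python
import urllib.request

def build_authorized_request(url, token):
    """Attach an OAuth2 bearer token to a REST API request.

    The URL and token are placeholders for illustration; we only
    construct the request object to show the Authorization header
    convention, without sending anything over the network.
    """
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Bearer {token}")
    req.add_header("Accept", "application/json")
    return req

req = build_authorized_request("https://api.example.com/v1/jobs", "hypothetical-token")
print(req.get_header("Authorization"))  # Bearer hypothetical-token
```

An API gateway validating this token server-side (scopes, expiry, audience) is the other half of the picture that the posting's security-best-practices bullet refers to.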
LogiNext is looking for a technically savvy and experienced Software Engineer to take up front-end development efforts. You will design and develop elegant interfaces that exceed client expectations in terms of value and benefit. You will collaborate on scalability issues involving visualizing massive amounts of data and information. You will identify, define, and communicate best practices for the front end.
You are passionate about building secure, high-performing and scalable front-end applications. Your design intuition inclines towards usability, elegance and simplicity. You are biased towards open-source tools and existing frameworks. You have demonstrated strong inter-personal and communication skills.
Responsibilities
Translate wireframes into functional requirements; write well-abstracted, reusable, high-performance code for UI components
Work closely with design, product management, and development teams to create elegant, usable, responsive, and interactive interfaces
Contribute to continual improvement by suggesting improvements to the user interface, software architecture, or new technologies
Optimize web applications to maximize speed and scale, supporting diverse clients from high-powered desktop computers to low-end mobile devices
Understand and coach/teach others; show a high level of ethics and teamwork
Requirements
Bachelor's/Master's degree in Computer Science, Information Technology, or a related field
2 to 3 years of experience developing responsive, elegant, efficient, and cross-browser front-end applications using JavaScript, HTML5, and CSS3
Hands-on experience with various MVC frameworks and UI libraries such as ReactJS and D3.js
Hands-on experience with map APIs like Google Maps, OpenStreetMap, and other powerful information-rich maps
Familiarity with UX processes and design
Proven ability to drive large-scale projects with a deep understanding of Agile SDLC, high collaboration, and leadership
Excellent written and oral communication, judgment, and decision-making skills
Should have strong aptitude and domain knowledge
1. To guide and counsel students and parents for Admission in Indian universities and colleges.
2. Interacting with University delegates.
3. Manage events.
4. Help students with application and admission procedures, etc.
5. Participate in educational affairs.
6. Research & development activities.
Key skills
1. Candidate should have exposure/knowledge about the education system.
2. Excellent communication & presentation skills.
3. Should have a willingness to learn, a positive attitude, and strong problem-solving ability.

- Deploying, managing, and operating scalable, highly available, and fault-tolerant systems on AWS.
- Implementing and controlling the flow of data to and from AWS.
- Selecting the appropriate AWS service based on compute, data, or security requirements.
- Identifying appropriate use of AWS operational best practices.
- Estimating AWS usage costs and identifying operational cost control mechanisms.
- Managing & troubleshooting on Linux Based Platforms.
- Ensuring efficient functioning of data storage and processing functions in accordance with company security policies and best practices in cloud security.
- Identifying, analyzing, and resolving infrastructure vulnerabilities and application deployment issues.
- Regularly reviewing existing systems and making recommendations for improvements
Essential Criteria:
- Two or more years of hands-on experience operating AWS-based applications.
- Experience provisioning, operating and maintaining systems running on AWS.
- Strong troubleshooting skills with application servers such as JBoss and Tomcat on Linux platforms.
- Explicit knowledge of Computing, Storage, Networking, and Security technologies in the Amazon AWS hosting environment.
- Experience in setting up Cloud Monitoring using CloudWatch or any other tool.
- Knowledge of Load Balancers, Firewalls, and network switching components.
- Ability to identify and gather requirements to define a solution to be built and operated on AWS.
- Capabilities to provide AWS operations and deployment guidance and best practices throughout the lifecycle of a project.
- Efficient management of the Company’s IT Assets and Infrastructure.
- Strong knowledge of Linux, shell scripts, Apache, MySQL, development operations, configuration management, monitoring, and TCP/IP protocols.
- Highly self-motivated and hard-working with excellent communication skills. Excellent English verbal and written proficiency is essential.
- Understanding of the basics of scalable SaaS systems /Cloud fundamentals.
- Understanding of security best practices.
- Ability to work with very little supervision and to work well in a team environment.
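One bullet in the AWS role above asks for estimating AWS usage costs and identifying cost-control mechanisms; the estimation part is largely arithmetic. A minimal sketch (the hourly rates below are made up for illustration, not real AWS prices):

```python
def monthly_cost(instances):
    """Estimate monthly cost from (count, hourly_rate) pairs.

    The rates are illustrative placeholders, not actual AWS pricing.
    730 is the average number of hours in a month (8760 / 12).
    """
    HOURS_PER_MONTH = 730
    return sum(count * rate * HOURS_PER_MONTH for count, rate in instances)

fleet = [
    (2, 0.10),  # 2 app servers at a hypothetical $0.10/hour
    (1, 0.25),  # 1 database node at a hypothetical $0.25/hour
]
print(round(monthly_cost(fleet), 2))  # 328.5 under these made-up rates
```

Real estimates would pull current rates from the provider's pricing data and account for reserved-instance or savings-plan discounts, which is where the cost-control mechanisms come in.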
Salary: ₹12,500 in hand (₹15,800 CTC), plus ₹2,500 allowance
Job Role
Visiting 15 existing customers per day and converting them from offline to online
Explaining the Best Price store app's features, benefits, and offers
Generating 3 orders per day
Incentive paid per order, with a maximum incentive cap of ₹5,000
Can earn up to ₹17,500 in hand
Location: surroundings of the store, such as Mehadipatnam, Rajendranagar, Attapur, Falaknuma, Chandrayangutta, and the Old City
Qualification: Plus Two / graduation
A bike and a smartphone are a must
Must have experience in sales/marketing
Regards,
Ashok
- Manage sales operations in assigned district to achieve revenue goals.
- Supervise sales team members (the BSMs) on a daily basis and provide guidance whenever needed.
- Identify skill gaps and conduct training for the sales team.
- Work with team to implement new sales techniques to obtain profits.
- Assist in employee recruitment, promotion, retention and termination activities.
- Conduct employee performance evaluation and provide feedback for improvements.
- Contact potential customers and identify new business opportunities.
- Stay abreast of customer needs, market trends, and competitors.
- Maintain clear and complete sales reports for management review.
- Build strong relationships with customers for business growth.
- Analyze sales performances and recommend improvements.
- Ensure that sales team follows company policies and procedures at all times.
- Develop promotional programs to increase sales and revenue.
- Plan and coordinate sales activities for assigned projects.
