
Job Title: Business Development Intern (0–1 Year Experience)
Location: Gurgaon (On-site)
About the Role:
We are looking for a motivated Business Development Intern to join our team. This role is ideal for freshers who have strong communication skills and basic knowledge of business development or IT sales.
Key Requirements:
- 0–1 years of experience (Freshers welcome)
- Excellent communication skills (verbal and written)
- Basic knowledge of IT sales, lead generation, or B2B business development
Responsibilities:
- Assist in lead generation and client outreach through calls, emails, and LinkedIn
- Support the team in communicating with potential clients
- Learn and contribute to the sales process
Education:
- B.Tech or MBA preferred, but candidates with strong communication skills are encouraged to apply

About FiftyFive Technologies Pvt Ltd
The Sr AWS/Azure/GCP Databricks Data Engineer at Koantek will use comprehensive modern data engineering techniques and methods, together with advanced analytics, to support business decisions for our clients. Your goal is to support the use of data-driven insights to help our clients achieve business outcomes and objectives. You will collect, aggregate, and analyze structured/unstructured data from multiple internal and external sources and communicate patterns, insights, and trends to decision-makers. You will help design and build data pipelines, data streams, reporting tools, information dashboards, data service APIs, data generators, and other end-user information portals and insight tools. You will be a critical part of the data supply chain, ensuring that stakeholders can access and manipulate data for routine and ad hoc analysis to drive business outcomes using advanced analytics. You are expected to function as a productive member of a team, working and communicating proactively with engineering peers, technical leads, project managers, product owners, and resource managers.
Requirements:
- Strong experience as an AWS/Azure/GCP Data Engineer; must have AWS/Azure/GCP Databricks experience
- Expert proficiency in Spark (Scala and Python) and Spark SQL
- Must have data migration experience from on-prem to cloud
- Hands-on experience with Kinesis to process and analyze streaming data, Event/IoT Hubs, and Cosmos DB
- In-depth understanding of Azure/AWS/GCP cloud, data lake, and analytics solutions
- Expert-level hands-on experience designing and developing applications on Databricks
- Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services
- In-depth understanding of Spark architecture, including Spark Streaming, Spark Core, Spark SQL, DataFrames, RDD caching, and Spark MLlib
- Hands-on experience with the technology stack available in the industry for data management, ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc.
- Hands-on knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake
- Good working knowledge of code versioning tools such as Git, Bitbucket, or SVN
- Hands-on experience using Spark SQL with various data sources such as JSON, Parquet, and key-value pairs
- Experience preparing data for data science and machine learning, with exposure to model selection, model lifecycle, hyperparameter tuning, model serving, deep learning, etc.
- Demonstrated experience preparing data and automating and building data pipelines for AI use cases (text, voice, image, IoT data, etc.)
- Good to have: programming experience with .NET or Spark/Scala
- Experience creating tables and partitioning, bucketing, loading, and aggregating data using Spark Scala and Spark SQL/PySpark
- Knowledge of AWS/Azure/GCP DevOps processes such as CI/CD, as well as Agile tools and processes including Git, Jenkins, Jira, and Confluence
- Working experience with Visual Studio, PowerShell scripting, and ARM templates; able to build ingestion into ADLS and enable the BI layer for analytics
- Strong understanding of data modeling and defining conceptual, logical, and physical data models
- Big data/analytics/information analysis/database management in the cloud
- IoT/event-driven/microservices in the cloud: experience with private and public cloud architectures, their pros/cons, and migration considerations
- Ability to remain up to date with industry standards and technological advancements that will enhance data quality and reliability to advance strategic initiatives
- Working knowledge of RESTful APIs, the OAuth2 authorization framework, and security best practices for API gateways
- Guide customers in transforming big data projects, including development and deployment of big data and AI applications
- Guide customers on data engineering best practices, provide proofs of concept, architect solutions, and collaborate when needed
- 2+ years of hands-on experience designing and implementing multi-tenant solutions using AWS/Azure/GCP Databricks for data governance, data pipelines for near-real-time data warehouses, and machine learning solutions
- Overall 5+ years of experience in software development, data engineering, or data analytics using Python, PySpark, Scala, Spark, Java, or equivalent technologies; hands-on expertise in Apache Spark (Scala or Python)
- 3+ years of experience in query tuning, performance tuning, troubleshooting, and debugging Spark and other big data solutions
- Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience
- Ability to manage competing priorities in a fast-paced environment
- Ability to resolve issues
- Basic experience with or knowledge of Agile methodologies
- AWS Certified: Solutions Architect Professional
- Databricks Certified Associate Developer for Apache Spark
- Microsoft Certified: Azure Data Engineer Associate
- GCP: Google Cloud Certified Professional
Your Responsibilities:
- Own the Python-based backend stack that powers our product
- Collaborate with data scientists, backend developers (Node.js), front-end developers, and DevOps to design and implement new features
- Build and maintain several backend jobs and RESTful services to be used internally in a macroservices/distributed-services environment
- Deploy and monitor the jobs and endpoints, ensuring availability and scalability (the ability to handle 100x the data processing load)
- Work on the full project lifecycle, from requirements gathering and understanding the problem to deploying and maintaining the project
Skills that you bring Along:
- A minimum of 8 years of extensive work experience with Python and related frameworks, particularly Flask
- Extensive experience in designing and scheduling backend Python jobs
- Hands-on experience working with file formats such as JSON, Parquet, and CSV coming from the data science side
- Extensive experience with databases such as Postgres and Mongo
- Extensive experience with cloud infrastructure (AWS-based), e.g. AWS API Gateway, Lambda functions, etc.
- Experience with caches such as Redis and/or in-memory caches
- Good experience with microservices/macroservices or event-driven architectures
- Good experience with design patterns
- Experience writing advanced SQL queries; good knowledge of PL/SQL
- Good understanding of software design principles and domain-driven design
- Good experience with continuous delivery and containerization (Docker)
- Good experience designing and maintaining RESTful API endpoints
- Ideally, experience maintaining infrastructure-as-code using Terraform
- Ideally, experience in parallel data processing and building end-to-end data pipelines using tools such as Airflow/Prefect and Spark/Dask
- Excellent communication skills and the ability to explain complex topics in a simple manner
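As one hedged sketch of the "advanced SQL" the skills list above mentions, here is a window-function query run against an in-memory SQLite database (a convenient stand-in for Postgres; the table and data are hypothetical, and the SQL itself is portable):

```python
# Hypothetical example: rank each customer's orders by amount using
# a window function. SQLite (stdlib) stands in for Postgres here.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount INTEGER);
    INSERT INTO orders VALUES
        ('alice', 120), ('alice', 80), ('bob', 200), ('bob', 50);
""")

# RANK() OVER (PARTITION BY ...) numbers rows within each customer,
# largest order first.
rows = conn.execute("""
    SELECT customer, amount,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
    FROM orders
    ORDER BY customer, rnk
""").fetchall()
conn.close()
```

In Postgres the same query works unchanged; only the connection setup differs.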
- To implement the manpower plan in coordination with the Site Head and Department Heads, carry out placements and recruitment in line with the recruitment procedures for all categories, and ensure that the manpower strength is always maintained within the budgeted level.
- To handle the administration of all human resources/personnel systems, including performance appraisal systems, in respect of all employees of the Site.
- To prepare the training plan in consultation with the Operations Head and in coordination with all Department Heads, and ensure its implementation.
- To ensure that the training records, including the identified training needs of all employees, are maintained, and to assist in the preparation of the training plan.
- To maintain the Training Hall, including the equipment, aids, etc.
- To coordinate and ensure the evaluation/assessment of all trainees/probationers of all categories, and to implement the recommendations.
- To conduct the annual performance appraisal exercise with a view to maintaining and boosting the morale of the employees.
- To create a culture that fosters personal as well as organizational development.
- To ensure that the requirements of health and safety measures for the Site are adhered to and complied with.
- To ensure that the dossiers/personnel data of all employees are maintained up to date, and to provide the data as and when required.
- Manage sales operations in assigned district to achieve revenue goals.
- Supervise sales team members (the BSMs) on a daily basis and provide guidance whenever needed.
- Identify skill gaps and conduct training for the sales team.
- Work with team to implement new sales techniques to obtain profits.
- Assist in employee recruitment, promotion, retention and termination activities.
- Conduct employee performance evaluation and provide feedback for improvements.
- Contact potential customers and identify new business opportunities.
- Stay abreast of customer needs, market trends, and competitors.
- Maintain clear and complete sales reports for management review.
- Build strong relationships with customers for business growth.
- Analyze sales performances and recommend improvements.
- Ensure that sales team follows company policies and procedures at all times.
- Develop promotional programs to increase sales and revenue.
- Plan and coordinate sales activities for assigned projects.
- Provide outstanding services and ensure customer satisfaction.
