
Capture and edit video of school events, promotional content, and educational materials as a Video Production & Editing Intern. This role allows you to enhance your skills in videography and post-production.
Benefits:
● Hands-on experience in video editing and production
● Opportunity to create promotional content
● Portfolio-building for future career prospects
Key Responsibilities:
✔ Shoot and edit videos for school events and branding
✔ Develop promotional videos for digital marketing
✔ Work with the media team to enhance video content
Requirements:
● Experience with video editing software (Premiere Pro, Final Cut Pro, etc.)
● Creativity and storytelling ability

About Innovators International School

We are looking for a Senior Software Engineer with 5+ years of experience in modern C++ development, paired with strong hands-on skills in AWS, Node.js, data processing, and containerized service development. The ideal candidate will be responsible for building scalable systems, maintaining complex data pipelines, and modernizing applications through cloud-native approaches and automation.
This is a high-impact role where engineering depth meets platform evolution, ideal for someone who thrives on system-level thinking, data-driven applications, and full-stack delivery.
Key Responsibilities:
- Design, build, and maintain high-performance systems using modern C++
- Develop and deploy scalable backend services using Node.js and manage dependencies via NPM
- Architect and implement containerized services using Docker, with orchestration via Kubernetes or ECS
- Build, monitor, and maintain data ingestion, transformation, and enrichment pipelines
- Utilize AWS services (Lambda, EC2, S3, CloudWatch, Step Functions) to deliver reliable cloud-native solutions (see the sketch after this list)
- Implement and maintain modern CI/CD pipelines, ensuring seamless integration, testing, and delivery
- Participate in system design, peer code reviews, and performance tuning
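The posting itself contains no code; purely as a hedged illustration of the ingestion and AWS Lambda work named in the responsibilities above, here is a minimal sketch of an S3-triggered Lambda handler. The Python runtime is assumed only for brevity (the role centers on C++ and Node.js), and the bucket layout, JSON payload format, and `enrich` helper are hypothetical.

```python
# Hypothetical sketch only: an AWS Lambda handler (Python runtime) that reacts to an
# S3 "object created" event, reads the uploaded file, and runs a placeholder
# enrichment step. The event shape follows the standard S3 notification format.
import json
import boto3

s3 = boto3.client("s3")

def enrich(row):
    # Placeholder enrichment step; real logic would live here.
    row["processed"] = True
    return row

def handler(event, context):
    for record in event["Records"]:                      # one record per uploaded object
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        rows = json.loads(body)                          # assumes the object is a JSON array
        enriched = [enrich(r) for r in rows]
        print(f"processed {len(enriched)} rows from s3://{bucket}/{key}")
    return {"statusCode": 200}
```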
Required Skills:
- 5+ years of software development experience, with strong command over modern C++
- Solid experience with Node.js, JavaScript, and NPM for backend development
- Deep understanding of cloud platforms (preferably AWS) and hands-on experience in deploying and managing applications in the cloud
- Proficient in building and scaling data processing workflows and working with structured/unstructured data
- Strong hands-on experience with Docker, container orchestration, and microservices architecture
- Working knowledge of CI/CD practices, Git, and build/release tools
- Strong problem-solving, debugging, and cross-functional collaboration skills
Preferred / Nice to Have:
- Exposure to data streaming frameworks (Kafka, Spark, etc.)
- Familiarity with monitoring and observability tools (e.g., Prometheus, Grafana, ELK stack)
- Background in performance profiling, secure coding, or legacy modernization
- Ability to work in agile environments and lead small technical initiatives
About Us: YMGrad is a leading study abroad consultancy firm dedicated to helping students achieve their dreams of higher education abroad. We provide comprehensive services, including application assistance, university shortlisting, and expert guidance throughout the admissions process.
Job Overview: We are seeking a skilled and motivated Content Writer to join our dynamic team. This role involves crafting compelling and personalized application materials, including Statements of Purpose (SOPs), Letters of Recommendation (LORs), resumes, and other college applications. Additionally, the writer will produce SEO-friendly blogs on study-abroad topics and provide consultancy to clients about the admissions process. The ideal candidate will be adaptable, collaborative, and enthusiastic about contributing to various aspects of our consultancy services.
Key Responsibilities:
1. Application Materials: Write and edit Statements of Purpose (SOPs), Letters of Recommendation (LORs), resumes, and other application documents tailored to individual client profiles.
2. Blog Writing: Develop and publish SEO-friendly blog content on study abroad topics, including tips, trends, and country-specific information.
3. Client Consultation: Provide personalized advice to clients regarding the admissions process, application strategies, and document requirements.
4. University Shortlisting: Assist clients in shortlisting suitable universities based on their academic profile, career goals, and preferences.
5. Collaborative Support: Work closely with other team members, including advisors and counselors, to ensure a seamless and integrated service for clients.
6. Quality Assurance: Ensure all written content is error-free, adheres to client requirements, and maintains high standards of professionalism.
7. Continuous Improvement: Stay updated on changes in the study abroad landscape and admissions requirements to provide current and accurate advice.
We are Grups Automation, providing solutions in the automation industry.
Refer: www.grupsautomation.com
Job Profile: Jr. PLC Programmer
Responsibilities: Completion of PLC, HMI, SCADA, and VFD based projects; programming and commissioning
Min Qualification: Graduate / BE / BTech / Diploma
Experience: Fresher or with 2 years of experience
Gender: M/F
Location: Vasa
Joining time: At the earliest
CTC: 1.5 to 4 Lacs/yr, as per performance in interview
Duty Time: 9.30 to 6
Weekly off: Sunday
Please Apply - https://zrec.in/IGpwc?source=CareerSite
About Us
Infra360 Solutions is a services company specializing in Cloud, DevSecOps, Security, and Observability solutions. We help technology companies adopt a DevOps culture by focusing on a long-term DevOps roadmap. We identify technical and cultural issues in the journey of successfully implementing DevOps practices and work with the respective teams to fix them and increase overall productivity. We also run training sessions for developers so they understand the importance of DevOps.

Our services include DevOps, DevSecOps, FinOps, Cost Optimization, CI/CD, Observability, Cloud Security, Containerization, Cloud Migration, Site Reliability, Performance Optimization, SIEM and SecOps, Serverless Automation, Well-Architected Reviews, MLOps, and Governance, Risk & Compliance.

We assess a company's technology architecture, security, governance, compliance, and DevOps maturity, and help them optimize their cloud cost, streamline their technology architecture, and set up processes that improve the availability and reliability of their websites and applications. We set up tools for monitoring, logging, and observability, and focus on bringing the DevOps culture to the organization to improve its efficiency and delivery.
Job Description
Job Title: DevOps Intern
Department: Technology
Location: Gurgaon
Work Mode: On-site
Working Hours: 10 AM - 7 PM
Terms: Permanent
Experience: 6+ Months
Education: B.Tech/MCA/BCA
Notice Period: Immediately
We are seeking a motivated and talented DevOps Intern to join our dynamic team. As a DevOps Intern, you will have the opportunity to work closely with our experienced DevOps engineers to support and improve our development and deployment processes. This is an excellent opportunity for someone looking to kick-start their career in DevOps and gain hands-on experience in a fast-paced, innovative environment.
Below is a detailed description of the role, its responsibilities, and expectations.
Tech Stack :
- Kubernetes: Deep understanding of Kubernetes clusters, container orchestration, and its architecture.
- Terraform: Extensive hands-on experience with Infrastructure as Code (IaC) using Terraform for managing cloud resources.
- ArgoCD: Experience in continuous deployment and using ArgoCD to maintain GitOps workflows.
- Helm: Expertise in Helm for managing Kubernetes applications.
- Cloud Platforms: Expertise in AWS, GCP or Azure will be an added advantage.
- Debugging and Troubleshooting: The DevOps Intern must be proficient in identifying and resolving complex issues in a distributed environment, ranging from networking issues to misconfigurations in infrastructure or application components (see the sketch after this list).
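None of this tooling is shown in code in the original posting; as a rough, hypothetical sketch of the troubleshooting described in the last item, the snippet below uses the official `kubernetes` Python client to list pods that are not in a healthy phase, which is often the first step when debugging a cluster. Access via a local kubeconfig is assumed.

```python
# Hypothetical sketch: list pods whose phase is neither Running nor Succeeded,
# a quick way to surface failing workloads across all namespaces.
from kubernetes import client, config

def failing_pods():
    config.load_kube_config()   # assumes a local kubeconfig; use load_incluster_config() in-cluster
    v1 = client.CoreV1Api()
    for pod in v1.list_pod_for_all_namespaces(watch=False).items:
        if pod.status.phase not in ("Running", "Succeeded"):
            print(f"{pod.metadata.namespace}/{pod.metadata.name}: {pod.status.phase}")

if __name__ == "__main__":
    failing_pods()
```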
Key Responsibilities:
- Assist in the development, deployment, and maintenance of cloud infrastructure.
- Collaborate with development and operations teams to automate and improve the CI/CD pipeline.
- Monitor system performance and troubleshoot issues to ensure high availability and reliability.
- Implement and maintain configuration management tools.
- Participate in code reviews and contribute to improving development practices.
- Assist in the creation of documentation for processes and procedures.
- Support the team in managing and maintaining development and production environments.
Qualifications:
- Currently pursuing or recently completed a degree in Computer Science, Information Technology, or a related field.
- Prior internship experience and a completed DevOps course or certification are an added advantage.
- Basic understanding of cloud platforms (AWS, Azure, GCP) and cloud services.
- Familiarity with CI/CD tools (Jenkins, GitLab CI, CircleCI, etc.).
- Basic knowledge of scripting languages (Python, Bash, etc.).
- Understanding of version control systems (Git, SVN).
- Strong problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment.
- Eagerness to learn and adapt to new technologies and processes.
Preferred Qualifications:
- Experience with containerization technologies (Docker, Kubernetes).
- Knowledge of infrastructure as code (Terraform, Ansible).
- Familiarity with monitoring and logging tools (Prometheus, Grafana, ELK stack).
- Previous internship or project experience in DevOps or related fields.
What We Offer:
- Hands-on experience with modern DevOps tools and practices.
- Mentorship and guidance from experienced professionals.
- Opportunity to work on real-world projects and make a tangible impact.
- Collaborative and supportive work environment.
- Potential for future full-time opportunities based on performance.
Internship Details:
- Duration: 6-month internship, which can be converted to full-time employment based on performance
- Location: Gurgaon
- Mode: In Office

Role Overview:
We are seeking a highly skilled and motivated Data Scientist to join our growing team. The ideal candidate will be responsible for developing and deploying machine learning models from scratch to production level, focusing on building robust data-driven products. You will work closely with software engineers, product managers, and other stakeholders to ensure our AI-driven solutions meet the needs of our users and align with the company's strategic goals.
Key Responsibilities:
- Develop, implement, and optimize machine learning models and algorithms to support product development.
- Work on the end-to-end lifecycle of data science projects, including data collection, preprocessing, model training, evaluation, and deployment (see the sketch after this list).
- Collaborate with cross-functional teams to define data requirements and product taxonomy.
- Design and build scalable data pipelines and systems to support real-time data processing and analysis.
- Ensure the accuracy and quality of data used for modeling and analytics.
- Monitor and evaluate the performance of deployed models, making necessary adjustments to maintain optimal results.
- Implement best practices for data governance, privacy, and security.
- Document processes, methodologies, and technical solutions to maintain transparency and reproducibility.
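As a hedged illustration only (the posting does not prescribe a stack), the sketch below shows the preprocess/train/evaluate slice of the lifecycle described above using pandas and scikit-learn. The input file, the binary "label" column, the assumption of numeric features, and the logistic-regression baseline are all hypothetical choices made for the example.

```python
# Illustrative end-to-end slice: load tabular data, preprocess, train a baseline
# model, and evaluate it on a held-out split. Not the company's actual pipeline.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

df = pd.read_csv("training_data.csv")                     # hypothetical input file
X, y = df.drop(columns=["label"]), df["label"]            # assumes numeric features, binary label
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = Pipeline([
    ("scale", StandardScaler()),                          # preprocessing step
    ("clf", LogisticRegression(max_iter=1000)),           # simple baseline classifier
])
model.fit(X_train, y_train)
print("test AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```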
Qualifications:
- Bachelor's or Master's degree in Data Science, Computer Science, Engineering, or a related field.
- 5+ years of experience in data science, machine learning, or a related field, with a track record of developing and deploying products from scratch to production.
- Strong programming skills in Python and experience with data analysis and machine learning libraries (e.g., Pandas, NumPy, TensorFlow, PyTorch).
- Experience with cloud platforms (e.g., AWS, GCP, Azure) and containerization technologies (e.g., Docker).
- Proficiency in building and optimizing data pipelines, ETL processes, and data storage solutions.
- Hands-on experience with data visualization tools and techniques.
- Strong understanding of statistics, data analysis, and machine learning concepts.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a fast-paced, dynamic environment.
Preferred Qualifications:
- Knowledge of microservices architecture and RESTful APIs.
- Familiarity with Agile development methodologies.
- Experience in building taxonomy for data products.
- Strong communication skills and the ability to explain complex technical concepts to non-technical stakeholders.


Position Overview: We are seeking a talented Data Engineer with expertise in Power BI to join our team. The ideal candidate will be responsible for designing and implementing data pipelines, as well as developing insightful visualizations and reports using Power BI. Additionally, the candidate should have strong skills in Python, data analytics, PySpark, and Databricks. This role requires a blend of technical expertise, analytical thinking, and effective communication skills.
Key Responsibilities:
- Design, develop, and maintain data pipelines and architectures using PySpark and Databricks.
- Implement ETL processes to extract, transform, and load data from various sources into data warehouses or data lakes (see the sketch after this list).
- Collaborate with data analysts and business stakeholders to understand data requirements and translate them into actionable insights.
- Develop interactive dashboards, reports, and visualizations using Power BI to communicate key metrics and trends.
- Optimize and tune data pipelines for performance, scalability, and reliability.
- Monitor and troubleshoot data infrastructure to ensure data quality, integrity, and availability.
- Implement security measures and best practices to protect sensitive data.
- Stay updated with emerging technologies and best practices in data engineering and data visualization.
- Document processes, workflows, and configurations to maintain a comprehensive knowledge base.
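Purely as a hypothetical illustration (not taken from the posting), the sketch below shows a minimal PySpark ETL step of the kind described above: read raw JSON, deduplicate and aggregate, then write a curated Parquet table. The S3 paths and column names are placeholders.

```python
# Minimal, hypothetical PySpark ETL job: extract raw JSON, transform it into a
# daily aggregate, and load the result as Parquet. Paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_sales_etl").getOrCreate()

raw = spark.read.json("s3://raw-bucket/sales/2024-01-01/")                 # extract
daily = (
    raw.dropDuplicates(["order_id"])                                       # transform
       .withColumn("order_date", F.to_date("order_ts"))
       .groupBy("order_date", "region")
       .agg(F.sum("amount").alias("total_amount"))
)
daily.write.mode("overwrite").parquet("s3://curated-bucket/sales_daily/")  # load
```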
Requirements:
- Bachelor’s degree in Computer Science, Engineering, or related field. (Master’s degree preferred)
- Proven experience as a Data Engineer with expertise in Power BI, Python, PySpark, and Databricks.
- Strong proficiency in Power BI, including data modeling, DAX calculations, and creating interactive reports and dashboards.
- Solid understanding of data analytics concepts and techniques.
- Experience working with Big Data technologies such as Hadoop, Spark, or Kafka.
- Proficiency in programming languages such as Python and SQL.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud.
- Excellent analytical and problem-solving skills with attention to detail.
- Strong communication and collaboration skills to work effectively with cross-functional teams.
- Ability to work independently and manage multiple tasks simultaneously in a fast-paced environment.
Preferred Qualifications:
- Advanced degree in Computer Science, Engineering, or related field.
- Certifications in Power BI or related technologies.
- Experience with data visualization tools other than Power BI (e.g., Tableau, QlikView).
- Knowledge of machine learning concepts and frameworks.


§ Experience developing web-based, enterprise-wide applications in .NET 4.5 (C#, ASP.NET), WCF, WPF, AJAX, etc.
§ Understanding of enterprise application development and deployment.
§ Good understanding of OOP and design patterns, and experience with tools like Enterprise Architect.
§ Good knowledge of RDBMS design, programming, and DBA concepts, and experience working with SQL Server and Oracle databases.
§ Good design, coding, and testing skills.
§ Knowledge of and exposure to Service-Oriented Architecture, Enterprise Service Bus, and application integration (using middleware).
§ Experience working in the electricity utility domain, preferably in EA/AMR/AMI/EMS/IT System Integration/ERP (for Utilities) projects.
§ Mobility-based software development (GPRS, GSM); knowledge of BI tools implementation.
§ Experience in mobility-based applications will be an added advantage.


Job Description
This requirement is to service our client, a leading big data technology company that measures what viewers consume across platforms to enable marketers to make better advertising decisions. We are seeking a Senior Data Operations Analyst to mine large-scale datasets for our client. Your work will have a direct impact on driving business strategies for prominent industry leaders. Self-motivation and strong communication skills are both must-haves, and the ability to work in a fast-paced environment is desired.
Problems being solved by our client:
Our client measures consumer usage of devices linked to the internet and home networks, including computers, mobile phones, tablets, streaming sticks, smart TVs, thermostats, and other appliances. There are more screens and other connected devices in homes than ever before, yet there have been major gaps in understanding how consumers interact with this technology. Our client uses a measurement technology to unravel the dynamics of consumers' interactions with multiple devices.
Duties and responsibilities:
- The successful candidate will contribute to the development of novel audience measurement and demographic inference solutions.
- Develop, implement, and support statistical or machine learning methodologies and processes.
- Build and test new features and concepts, and integrate them into the production process
- Participate in ongoing research and evaluation of new technologies
- Apply your experience across the development lifecycle: analysis, design, development, testing, and deployment of this system
- Collaborate with teams in Software Engineering, Operations, and Product Management to deliver timely and quality data. You will be the knowledge expert, delivering quality data to our clients
Qualifications:
- 3-5 years of relevant work experience in the areas outlined below
- Experience extracting data using SQL from large databases (see the sketch after this list)
- Experience writing complex ETL processes and frameworks for analytics and data management; hands-on experience with ETL tools is a must.
- Master’s degree or PhD in Statistics, Data Science, Economics, Operations Research, Computer Science, or a similar degree with a focus on statistical methods. A Bachelor’s degree in the same fields with significant, demonstrated professional research experience will also be considered.
- Programming experience in a scientific computing language (R, Python, Julia) and the ability to interact with relational data (SQL, Apache Pig, SparkSQL). General-purpose programming (Python, Scala, Java) and familiarity with Hadoop are a plus.
- Excellent verbal and written communication skills.
- Experience with TV or digital audience measurement or market research data is a plus.
- Familiarity with systems analysis or systems thinking is a plus.
- Must be comfortable analyzing complex, high-volume, high-dimensional data from varying sources
- Excellent verbal, written and computer communication skills
- Ability to engage with Senior Leaders across all functional departments
- Ability to take on new responsibilities and adapt to changes
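As a small, hypothetical illustration of the SQL-extraction experience asked for above (not part of the original posting), the snippet below pulls a device-usage aggregate into pandas through SQLAlchemy; the connection string, table, and column names are invented for the example.

```python
# Hypothetical sketch: query a large relational store with SQL and load the result
# into pandas for further analysis. DSN, table, and columns are placeholders.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:password@host:5432/measurement")  # placeholder DSN

query = """
    SELECT device_type, DATE(event_ts) AS day, COUNT(*) AS sessions
    FROM device_events
    WHERE event_ts >= NOW() - INTERVAL '7 days'
    GROUP BY device_type, DATE(event_ts)
"""
usage = pd.read_sql(query, engine)
print(usage.head())
```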

Should have 2-4 years of experience in product design.

