
1. Ideation, planning and execution of content calendars for Instagram, Facebook, Twitter and LinkedIn (working with the design team)
2. Write copy and captions for all platforms, and work with the design team to execute creatives.
3. Responsible for the growth of each platform
a. Followers
b. Engagement
c. Driving conversations
d. Traffic to the website from these social media channels
4. Use social media listening and analytics to identify areas of improvement and take the necessary action on a weekly basis (this also includes weekly reporting and suggestions). Provide insights to the copy and design teams to create innovative content
5. Assist in scheduling content on the platforms (as needed)
6. Ensure queries on social platforms are always addressed (working with customer care/ORM)
7. Build partnerships with brands/platforms to grow the follower base and engagement for our own platforms, as needed
8. Plan and execute shoots for social media (stills and reels)
COMMUNITY
1. Build an external community to support and enhance brand awareness metrics.
2. Work with internal and external teams to ensure that the content calendar is set and executed (for external community) in a timely manner.
PR
1. Work with PR team to build brand awareness and salience
2. Ensure monthly dissemination of press releases and consequent coverage
3. Work with the copy team to draft answers, articles and press releases for brand campaigns, launches, etc.

Similar jobs
About the Role:
We are seeking an experienced Data Engineer to lead and execute the migration of existing Databricks-based pipelines to Snowflake. The role requires strong expertise in PySpark/Spark, Snowflake, DBT, and Airflow, with additional exposure to DevOps and CI/CD practices. The candidate will be responsible for re-architecting data pipelines, ensuring data consistency, scalability, and performance in Snowflake, and enabling robust automation and monitoring across environments.
Key Responsibilities
Databricks to Snowflake Migration
· Analyze and understand existing pipelines and frameworks in Databricks (PySpark/Spark).
· Re-architect pipelines for execution in Snowflake using efficient SQL-based processing.
· Translate Databricks notebooks/jobs into Snowflake/DBT equivalents.
· Ensure a smooth transition with data consistency, performance, and scalability.
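For orientation only, a minimal sketch of what such a translation might look like, assuming a hypothetical Databricks aggregation is pushed down to Snowflake as set-based SQL via the snowflake-connector-python package (all table, warehouse and credential names below are illustrative, not part of this role's actual codebase):

```python
# Hypothetical PySpark step from a Databricks notebook (names are illustrative):
#   daily = (spark.table("raw.orders")
#            .groupBy("order_date")
#            .agg(F.sum("amount").alias("total_amount")))
#   daily.write.mode("overwrite").saveAsTable("analytics.daily_orders")

# Rough Snowflake equivalent: push the same logic down as set-based SQL.
import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="my_account",      # assumption: real credentials would come from config/secrets
    user="etl_user",
    password="...",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    conn.cursor().execute("""
        CREATE OR REPLACE TABLE analytics.public.daily_orders AS
        SELECT order_date, SUM(amount) AS total_amount
        FROM raw.public.orders
        GROUP BY order_date
    """)
finally:
    conn.close()
```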
Snowflake
· Work hands-on with storage integrations, staging (internal/external stages), Snowpipe, tables/views, COPY INTO, CREATE OR ALTER, and file formats.
· Implement RBAC (role-based access control), data governance, and performance tuning.
· Design and optimize SQL queries for large-scale data processing.
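Purely as an illustration of the objects the list above mentions (file format, stage, COPY INTO), a rough sketch using the same Python connector; the storage integration, bucket path and table names are assumptions:

```python
import snowflake.connector

# Assumption: S3_INT is an existing storage integration; bucket and table names are made up.
statements = [
    "CREATE FILE FORMAT IF NOT EXISTS parquet_ff TYPE = PARQUET",
    """CREATE STAGE IF NOT EXISTS orders_stage
         URL = 's3://example-bucket/orders/'
         STORAGE_INTEGRATION = S3_INT
         FILE_FORMAT = (FORMAT_NAME = 'parquet_ff')""",
    """COPY INTO raw.public.orders
         FROM @orders_stage
         MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE""",
]

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",  # illustrative credentials
    warehouse="LOAD_WH", database="RAW", schema="PUBLIC",
)
cur = conn.cursor()
for stmt in statements:
    cur.execute(stmt)   # run DDL, then the bulk load
conn.close()
```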
DBT (with Snowflake)
· Implement and manage models, macros, materializations, and SQL execution within DBT.
· Use DBT for modular development, version control, and multi-environment deployments.
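As a hedged sketch of what a single DBT model and its invocation might look like (dbt-core 1.5+ exposes a programmatic dbtRunner; the model name, sources and target are made up):

```python
# Illustrative dbt model, models/daily_orders.sql (shown as a comment for reference):
#   {{ config(materialized='incremental', unique_key='order_date') }}
#   select order_date, sum(amount) as total_amount
#   from {{ source('raw', 'orders') }}
#   group by 1

# Programmatic invocation (dbt-core 1.5+), roughly equivalent to:
#   dbt run --select daily_orders --target dev
from dbt.cli.main import dbtRunner

result = dbtRunner().invoke(["run", "--select", "daily_orders", "--target", "dev"])
if not result.success:
    raise RuntimeError("dbt run failed")
```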
Airflow (Orchestration)
· Design and manage DAGs to automate workflows and ensure reliability.
· Handle task dependencies, error recovery, monitoring, and integrations (Cosmos, Astronomer, Docker).
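A minimal, illustrative Airflow 2.x DAG along these lines (the DAG id, task names and shell commands are hypothetical; tools such as Cosmos or Astronomer could replace the plain BashOperator for dbt):

```python
# Minimal sketch of an Airflow DAG: run dbt after a load step, retry on failure.
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="snowflake_daily_load",          # hypothetical DAG and task names
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    load = BashOperator(task_id="copy_into_raw", bash_command="python load_to_snowflake.py")
    transform = BashOperator(task_id="dbt_run", bash_command="dbt run --select daily_orders")

    load >> transform   # task dependency: transform only after a successful load
```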
DevOps & CI/CD
· Develop and manage CI/CD pipelines for Snowflake and DBT using GitHub Actions, Azure DevOps, or equivalent.
· Manage version-controlled environments and ensure smooth promotion of changes across dev, test, and prod.
Monitoring & Observability
· Implement monitoring, alerting, and logging for data pipelines.
· Build self-healing or alert-driven mechanisms for critical/severe issue detection.
· Ensure system reliability and proactive issue resolution.
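One possible shape for an alert-driven mechanism, sketched as an Airflow-style failure callback posting to a chat webhook; the endpoint URL and message format are assumptions, not an existing integration:

```python
# Illustrative failure-alert hook: post pipeline failures to a chat webhook.
import json
import urllib.request

WEBHOOK_URL = "https://hooks.example.com/alerts"   # hypothetical endpoint

def notify_failure(context):
    """Airflow-style on_failure_callback: context carries task/run metadata."""
    payload = {
        "text": f"Task {context['task_instance'].task_id} failed "
                f"in DAG {context['dag'].dag_id} at {context['ts']}"
    }
    req = urllib.request.Request(
        WEBHOOK_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

# Usage (sketch): pass via default_args when defining the DAG, e.g.
#   default_args={"on_failure_callback": notify_failure}
```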
Required Skills & Qualifications
· 5+ years of experience in data engineering with focus on cloud data platforms.
· Strong expertise in:
· Databricks (PySpark/Spark) – analysis, transformations, dependencies.
· Snowflake – architecture, SQL, performance tuning, security (RBAC).
· DBT – modular model development, macros, deployments.
· Airflow – DAG design, orchestration, and error handling.
· Experience in CI/CD pipeline development (GitHub Actions, Azure DevOps).
· Solid understanding of data modeling, ETL/ELT processes, and best practices.
· Excellent problem-solving, communication, and stakeholder collaboration skills.
Good to Have
· Exposure to Docker/Kubernetes for orchestration.
· Knowledge of Azure Data Services (ADF, ADLS) or similar cloud tools.
· Experience with data governance, lineage, and metadata management.
Education
· Bachelor’s / Master’s degree in Computer Science, Engineering, or related field.
Job Description
Build and maintain bots on the Azure platform, including integration with Active Directory and Web API-based integration with external systems. Train and integrate bots as per users' requirements. Work in line with the design guidelines, best practices and standards for bot deliverables. Bring a creative approach to conversation flow design, with human aspects and sentiment in the bot responses.
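Although the stack here is primarily .NET, the Bot Framework SDK also ships for Python; purely as an illustration, a minimal message handler might look like the sketch below (the class name and reply text are made up, and a real bot would add LUIS, adaptive cards and Azure AD integration):

```python
# Minimal sketch of a Bot Framework activity handler (Python SDK: botbuilder-core).
from botbuilder.core import ActivityHandler, MessageFactory, TurnContext

class SupportBot(ActivityHandler):            # hypothetical bot class
    async def on_message_activity(self, turn_context: TurnContext):
        user_text = turn_context.activity.text or ""
        # Echo with a human touch; a real bot would route the intent via LUIS here.
        await turn_context.send_activity(
            MessageFactory.text(f"Thanks! You said: {user_text}")
        )
```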
Qualifications
a) 5 years of experience in software development with a clear understanding of the project life cycle.
b) Minimum 2-3 years of hands-on experience with the Microsoft Azure Bot Framework, LUIS and other Cognitive Services offered by Azure.
c) Hands-on experience with machine learning based chatbots.
d) Experience with Azure bot services such as Text Analytics.
e) Strong database skills and hands-on experience with databases like SQL Server/Oracle.
f) Strong experience with Azure Active Directory and adaptive cards integration in chatbots.
g) Strong experience designing and working with service-oriented architectures (SOA) and Web APIs.
h) Strong experience with Microsoft Azure, ASP.NET / MVC and programming languages such as C# / VB.NET.
i) Knowledge of Python and NodeJS is a plus.
j) Ability to design and optimize SQL Server 2008 stored procedures.
k) Experience with jQuery, CSS3, HTML5 or similar technologies.
l) Ability to adapt quickly to an existing, complex environment.
Job Role: Python Developer
Experience: 6 to 10 Years
Location: Bangalore
Responsibilities:
- Develop, test, and deploy Python applications.
- Build and maintain front-end components using JavaScript, jQuery, HTML, and CSS.
- Collaborate with teams to implement new features and optimize systems.
- Ensure code quality, security, and performance.
Required Skills:
- Strong Python experience (Django, Flask, or FastAPI).
- Proficiency in JavaScript, jQuery, HTML, CSS.
- Experience with relational databases (PostgreSQL, MySQL) and Git.
- Strong problem-solving and communication skills.
- Bonus: RESTful APIs, AWS/Azure, CI/CD pipelines.
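For illustration only, and assuming the FastAPI option from the skills list above with made-up endpoints and model fields, a minimal service might look like:

```python
# Minimal FastAPI sketch: one JSON endpoint plus a health check.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    name: str
    price: float

@app.get("/health")
def health() -> dict:
    return {"status": "ok"}

@app.post("/items")
def create_item(item: Item) -> dict:
    # A real service would persist this to PostgreSQL/MySQL instead of echoing it back.
    return {"created": item.name, "price": item.price}

# Run locally with: uvicorn main:app --reload
```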
Job Description
Mandatory Requirements
- Experience in AWS Glue
- Experience in Apache Parquet
- Proficient in AWS S3 and data lakes
- Knowledge of Snowflake
- Understanding of file-based ingestion best practices
- Scripting languages: Python and PySpark (see the sketch after this list)
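A rough sketch of a Glue PySpark job along these lines; the bucket paths, column names and job arguments are hypothetical:

```python
# Illustrative AWS Glue PySpark job: read Parquet from S3, aggregate, write back to the lake.
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

orders = spark.read.parquet("s3://example-datalake/raw/orders/")   # hypothetical path
daily = (orders.groupBy("order_date")
               .sum("amount")
               .withColumnRenamed("sum(amount)", "total_amount"))
daily.write.mode("overwrite").parquet("s3://example-datalake/curated/daily_orders/")

job.commit()
```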
CORE RESPONSIBILITIES
- Create and manage cloud resources in AWS
- Ingest data from different sources that expose it via different technologies (RDBMS, flat files, streams, and time-series data from various proprietary systems), implementing ingestion and processing with Big Data technologies
- Process and transform data using technologies such as Spark and cloud services; understand the relevant business logic and implement it in the language supported by the underlying data platform
- Develop automated data quality checks to ensure the right data enters the platform and to verify the results of calculations (see the sketch after this list)
- Develop infrastructure to collect, transform, combine and publish/distribute customer data
- Define process improvement opportunities to optimize data collection, insights and displays
- Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible
- Identify and interpret trends and patterns from complex data sets
- Construct a framework using data visualization tools and techniques to present consolidated, actionable analytical results to relevant stakeholders
- Participate as a key member in regular Scrum ceremonies with the agile teams
- Develop queries, write reports and present findings proficiently
- Mentor junior team members and bring in industry best practices
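A minimal sketch of the automated data quality check mentioned above, with made-up rules, paths and column names:

```python
# Sketch of a data quality gate: fail fast if null or duplicate keys reach the curated layer.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()
df = spark.read.parquet("s3://example-datalake/curated/daily_orders/")   # hypothetical path

null_keys = df.filter(F.col("order_date").isNull()).count()
dupes = df.count() - df.dropDuplicates(["order_date"]).count()

if null_keys or dupes:
    raise ValueError(f"DQ check failed: {null_keys} null keys, {dupes} duplicate keys")
print("DQ checks passed")
```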
QUALIFICATIONS
- 5-7+ years' experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
- Strong background in math, statistics, computer science, data science or a related discipline
- Advanced knowledge of at least one of the following languages: Java, Scala, Python, C#
- Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake
- Proficient with:
  - Data mining/programming tools (e.g. SAS, SQL, R, Python)
  - Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)
  - Data visualization (e.g. Tableau, Looker, MicroStrategy)
- Comfortable learning about and deploying new technologies and tools
- Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines
- Good written and oral communication skills and the ability to present results to non-technical audiences
- Knowledge of business intelligence and analytical tools, technologies and techniques
Familiarity and experience in the following is a plus:
- AWS certification
- Spark Streaming
- Kafka Streaming / Kafka Connect
- ELK Stack
- Cassandra / MongoDB
- CI/CD: Jenkins, GitLab, Jira, Confluence and other related tools
As an IT Product Sales Person at UTS, you will play a pivotal role in driving the growth of our IT product sales within the Chennai region. You will be responsible for identifying and targeting potential clients, understanding their IT requirements, and presenting them with tailored solutions from our product portfolio. Your role will involve building and nurturing client relationships, collaborating with the sales team, and contributing to the achievement of revenue targets.
Job Responsibilities
· International freight agents are responsible for matching authorized, reliable transportation carriers with shippers and coordinating all of the shipping needs of many companies.
· Maintain current clients, generate leads, attract new prospects and develop a sales pipeline.
· Acquire new business through prospecting, cold calling, etc. Contract with freight shipping carriers and negotiate the best rates and services for our customers.
Skills Required :
· Excellent Communication skills.
· Demonstrated ability to meet sales targets.
· Good understanding of the Freight Industry.
· Proficiency in office software, including Microsoft Word, Excel, and Outlook Express.
· Proficient negotiating skills.
· Excellent problem-solving abilities.
NOTE: Minimum 6 months of experience in freight brokerage is MANDATORY.
Location: Mohali, Delhi, Dehradun, Gurugram
This wellness platform is revolutionising primary health and wellness in India.
What you will do:
- Identifying broader trends and filling category gaps
- Supplier relationship building
- Handling merchandising operations
- Coordinating with marketing, supply chain, catalog team, finance/ commercial and other functions of the organization
- Catalog monitoring
- Discount monitoring
Desired Candidate Profile
What you need to have:
- 4-10 years of experience
- Good analytical skills
- Willingness to learn, innovate, take initiative, think beyond what is conventional and accepted, and the ability to work in a flat organization
- Strong analytical aptitude in problem solving
- Proven track record of finding innovative solutions to business problems
- Previous work experience in e-commerce companies
We are currently working on a big project for which we need a Golang Developer immediately, for 6 months on a contract basis, with a strong understanding of quality design and problem solving and the ability to contribute ideas on using the latest technologies. This role will involve collaboration with business partners, product managers and other representatives. We're looking for a highly skilled Senior Software Engineer to work closely with the product team and the open-source community to build the Digirex vision. You'll join a highly collaborative team, working alongside talented engineers focused on being the best in the Cloud-native ecosystem. You should also be willing to work with a start-up.
