- Implementing digital marketing activities to enhance the Fanztar community.
- Learning different email/WhatsApp marketing activities and executing them.
- Increasing brand awareness in your institution.
- Providing support assistance for customer queries.
About Fanztar
At Fanztar, we are creating a first-of-its-kind platform for the creator economy in India. We want to enable greater control for creators and fans. We are a group of passionate techies, designers, data scientists and marketers driven to build the utility layer for creator and fan communities.
We are looking for a talented and competitive Marketing Manager with experience at healthcare or SaaS product companies to join our team. This position will play a fundamental role in achieving our ambitious customer acquisition and revenue growth objectives.
You will be responsible for understanding the RCM and PI market landscape, identifying your target audience, and collaborating to create the strategic positioning, targeting plan, and channel mix to meet your category's pipeline target.
The ideal candidate should be a self-starter who thrives in a fast-paced environment.
Responsibilities:
- Evaluating and optimizing marketing and pricing strategies.
- Analyzing market trends and preparing forecasts.
- Generating new business leads.
- Increasing brand awareness and market share.
- Coordinating marketing strategies with the sales, financial, public relations, and production departments.
- Developing and managing the marketing department's budget.
- Overseeing branding, advertising, and promotional campaigns.
- Managing the marketing department's staff.
- Preparing and presenting quarterly and annual reports to senior management.
- Promoting our brand at trade shows and major industry-related events.
- Keeping informed of marketing strategies and trends.
Requirements:
- Bachelor's degree in marketing, finance, business administration, or similar.
- A master's degree in a relevant field will be advantageous.
- At least 4 years' experience as a marketing manager.
- Extensive knowledge of marketing strategies, channels, and branding.
- Superb leadership, communication, and collaboration abilities.
- Exceptional analytical and problem-solving skills.
- Strong time management and organizational abilities.
Our client is a rapidly expanding global technology partner looking for a highly skilled Senior (Python) Data Engineer to join their exceptional Technology and Development team. The role is based in Kolkata. If you are passionate about demonstrating your expertise and thrive on collaborating with a group of talented engineers, then this role was made for you!
At the heart of technology innovation, our client specializes in delivering cutting-edge solutions to clients across a wide array of sectors. With a strategic focus on finance, banking, and corporate verticals, they have earned a stellar reputation for their commitment to excellence in every project they undertake.
We are searching for a senior engineer to strengthen their global projects team: an experienced Senior Data Engineer with a strong background in building Extract, Transform, Load (ETL) processes and a deep understanding of AWS serverless cloud environments.
As a vital member of the data engineering team, you will play a critical role in designing, developing, and maintaining data pipelines that facilitate data ingestion, transformation, and storage for our organization.
Your expertise will contribute to the foundation of our data infrastructure, enabling data-driven decision-making and analytics.
Key Responsibilities:
- ETL Pipeline Development: Design, develop, and maintain ETL processes using Python, AWS Glue, or other serverless technologies to ingest data from various sources (databases, APIs, files), transform it into a usable format, and load it into data warehouses or data lakes.
- AWS Serverless Expertise: Leverage AWS services such as AWS Lambda, AWS Step Functions, AWS Glue, AWS S3, and AWS Redshift to build serverless data pipelines that are scalable, reliable, and cost-effective.
- Data Modeling: Collaborate with data scientists and analysts to understand data requirements and design appropriate data models, ensuring data is structured optimally for analytical purposes.
- Data Quality Assurance: Implement data validation and quality checks within ETL pipelines to ensure data accuracy, completeness, and consistency.
- Performance Optimization: Continuously optimize ETL processes for efficiency, performance, and scalability, monitoring and troubleshooting any bottlenecks or issues that may arise.
- Documentation: Maintain comprehensive documentation of ETL processes, data lineage, and system architecture to ensure knowledge sharing and compliance with best practices.
- Security and Compliance: Implement data security measures, encryption, and compliance standards (e.g., GDPR, HIPAA) as required for sensitive data handling.
- Monitoring and Logging: Set up monitoring, alerting, and logging systems to proactively identify and resolve data pipeline issues.
- Collaboration: Work closely with cross-functional teams, including data scientists, data analysts, software engineers, and business stakeholders, to understand data requirements and deliver solutions.
- Continuous Learning: Stay current with industry trends, emerging technologies, and best practices in data engineering and cloud computing and apply them to enhance existing processes.
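The ETL and data-quality responsibilities above can be sketched in miniature. The record fields, validation rules, and in-memory "sink" below are illustrative assumptions, not details from the posting; a real pipeline would ingest from the listed sources (databases, APIs, files) and load into a warehouse such as Redshift or a data lake on S3.

```python
# Minimal extract -> validate -> transform -> load sketch.
# Field names and quality rules are hypothetical examples.

def extract(rows):
    """Simulate ingesting raw records from a source (API, file, DB)."""
    return list(rows)

def validate(record):
    """Basic quality checks: required fields present and non-empty."""
    return bool(record.get("id")) and record.get("amount") is not None

def transform(record):
    """Normalize records into the shape the warehouse expects."""
    return {"id": str(record["id"]), "amount": round(float(record["amount"]), 2)}

def run_pipeline(rows):
    loaded, rejected = [], []
    for raw in extract(rows):
        if validate(raw):
            loaded.append(transform(raw))   # "load" into an in-memory sink
        else:
            rejected.append(raw)            # route bad records for review
    return loaded, rejected

good, bad = run_pipeline([
    {"id": 1, "amount": "19.991"},
    {"id": None, "amount": 5},              # fails the quality check
])
print(good)   # [{'id': '1', 'amount': 19.99}]
print(bad)    # [{'id': None, 'amount': 5}]
```

In a serverless deployment, `run_pipeline` would typically be the body of an AWS Lambda handler or a Glue job, with the rejected records routed to a quarantine location rather than kept in memory.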
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Proven experience as a Data Engineer with a focus on ETL pipeline development.
- Strong proficiency in Python programming.
- In-depth knowledge of AWS serverless technologies and services.
- Familiarity with data warehousing concepts and tools (e.g., Redshift, Snowflake).
- Experience with version control systems (e.g., Git).
- Strong SQL skills for data extraction and transformation.
- Excellent problem-solving and troubleshooting abilities.
- Ability to work independently and collaboratively in a team environment.
- Effective communication skills for articulating technical concepts to non-technical stakeholders.
- Certifications such as AWS Certified Data Analytics - Specialty or AWS Certified DevOps Engineer are a plus.
Preferred Experience:
- Knowledge of data orchestration and workflow management tools.
- Familiarity with data visualization tools (e.g., Tableau, Power BI).
- Previous experience in industries with strict data compliance requirements (e.g., insurance, finance) is beneficial.
What You Can Expect:
- Innovation Abounds: Join a company that constantly pushes the boundaries of technology and encourages creative thinking. Your ideas and expertise will be valued and put to work in pioneering solutions.
- Collaborative Excellence: Be part of a team of engineers who are as passionate and skilled as you are. Together, you'll tackle challenging projects, learn from each other, and achieve remarkable results.
- Global Impact: Contribute to projects with a global reach and make a tangible difference. Your work will shape the future of technology in finance, banking, and corporate sectors.
They offer an exciting and professional environment with great career and growth opportunities. Their office is located in the heart of Salt Lake Sector V, offering a terrific workspace that's both accessible and inspiring, and team members enjoy regular team outings. Joining means becoming part of a vibrant and dynamic team where your skills will be valued, your creativity nurtured, and your contributions meaningful. In this role, you can work alongside some of the brightest minds in the industry.
If you're ready to take your career to the next level and be part of a dynamic team that's driving innovation on a global scale, we want to hear from you.
Apply today for more information about this exciting opportunity.
Onsite Location: Kolkata, India (Salt Lake Sector V)
Roles & Responsibilities:
• Following up on new business opportunities and taking ownership of the sales cycle from the SQL (sales-qualified lead) stage through to closure
• Navigating prospects through the POC and negotiating contracts within the expected timeframe
• Manage multiple customers simultaneously at various stages of the Gumlet buying cycle
• At times a candidate is also expected to do end-to-end sales from lead generation to closure, depending on the business requirements
• Consistently learn about Gumlet and become an expert on the product's value propositions
• Planning and preparing presentations to demonstrate Gumlet value proposition
• Develop and maintain territory plans which outline how sales targets will be met on an ongoing basis
• Provide forecasts on the best case and most likely sales volumes over relevant time periods
• Collaborate with other appropriate internal teams and represent the customers to solve their business requirements.
• Work with various internal stakeholders and achieve team targets.
Responsibilities-
- Participate in the entire application lifecycle, focusing on coding and debugging
- Write clean code to develop functional web applications
- Troubleshoot and debug applications
- Collaborate with Front-end developers to integrate user-facing elements with server-side logic
- Gather and address technical and design requirements
- Provide training and support to internal teams
- Build reusable code and libraries for future use
- Liaise with developers, designers to identify new features
- Follow emerging technologies
Requirements-
- At least 1 year in a solid back-end role
- Experience taking a problem to a product solution (module/product)
- Prior experience working with startups and understanding the fast-paced, dynamic nature of work
Must have skillsets:
- JavaScript, Node.js
- Client-server architecture
- OOP concepts, including design patterns
- Understanding of RDBMS (MySQL, PostgreSQL)
- Understanding of REST APIs
- Familiarity with frameworks such as Express.js
- Knowledge of TDD frameworks such as Mocha, Chai, Jest, etc.
- Knowledge of AWS (Lambda, RDS, EC2, SQS, S3, ECS, etc.)
- Basic knowledge of TypeScript, Next.js
- Prior experience in fintech
Excellent programming skills in C/C++ and Python
Experience with cloud and virtualization technologies is required.
Strong working experience developing applications on Linux
Experience working with multi-threading, IPC, and socket programming is a must.
Familiarity with OS concepts such as memory management and scheduling is desirable.
Familiarity with the TCP/IP protocol stack is desirable
Knowledge of the Linux networking stack, including a conceptual understanding of IPsec, iptables, conntrack, bridging, policy-based routing, etc., is desirable
Familiarity with container technology
Awareness of Agile methodologies and CI/CD practices
Experience with all phases of a project: development, testing, deployment, and management of enterprise solutions
Excellent verbal and written communication skills.
Self-motivation and the ability to work under aggressive timelines are a must.
Strong problem-solving skills and very good time management skills
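The multi-threading and socket-programming requirements above can be illustrated with a tiny threaded TCP echo server. It is sketched in Python (also part of this role's stack) rather than C/C++ for brevity; the port is OS-assigned and the payload is arbitrary.

```python
import socket
import threading

def handle_client(conn):
    """Echo whatever the client sends, then close the connection."""
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)

def serve_one(server):
    """Accept a single client and hand it off to a worker thread."""
    conn, _addr = server.accept()
    worker = threading.Thread(target=handle_client, args=(conn,))
    worker.start()
    worker.join()

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))      # port 0: let the OS pick a free port
server.listen()
port = server.getsockname()[1]

# Run the accept loop on a background thread so one process can
# demonstrate both ends of the connection.
threading.Thread(target=serve_one, args=(server,), daemon=True).start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"ping")
    reply = client.recv(1024)

server.close()
print(reply)  # b'ping'
```

A production C/C++ version would add an accept loop, `epoll` or a thread pool, and proper error handling, but the accept/recv/send structure is the same.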
We are looking for an experienced engineer with superb technical skills. You will primarily be responsible for architecting and building large-scale data pipelines that deliver AI and analytical solutions to our customers. The right candidate will enthusiastically take ownership of developing and managing continuously improving, robust, scalable software solutions. The successful candidate will be curious, creative, ambitious, self-motivated, flexible, and have a bias toward action. As part of the early engineering team, you will have the chance to make a measurable impact on the future of Thinkdeeply, along with a significant amount of responsibility.
Although your primary responsibilities will be around back-end work, we prize individuals who are willing to step in and contribute to other areas, including automation, tooling, and management applications. Experience with, or a desire to learn, Machine Learning is a plus.
Experience
12+ Years
Location
Hyderabad
Skills
Bachelor's/Master's/PhD in CS or equivalent industry experience
10+ years of industry experience with Java-related frameworks such as Spring and/or Typesafe
Experience with scripting languages; 5+ years of industry experience with Python is highly desirable
Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
Demonstrated expertise in building and shipping cloud-native applications
Experience administering (including setting up, managing, and monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka, the ELK Stack, and Fluentd
Experience in API development using Swagger
Strong expertise with containerization technologies, including Kubernetes and Docker Compose
Experience with cloud platform services such as AWS, Azure, or GCP
Experience implementing automated testing platforms and unit tests
Proficient understanding of code versioning tools such as Git
Familiarity with continuous integration (e.g., Jenkins)
Responsibilities
Architect, design, and implement large-scale data processing pipelines
Design and implement APIs
Assist in DevOps operations
Identify performance bottlenecks and bugs, and devise solutions to these problems
Help maintain code quality, organization, and documentation
Communicate with stakeholders regarding various aspects of the solution
Mentor team members on best practices