1. Adhere to brand guidelines and complete projects according to the deadline.
2. Retouch and manipulate images.
3. Use graphic design software and work with a wide variety of media.
4. Collaborate with the creative director and graphic design manager to develop design concepts.
5. Receive feedback from the creative director and graphic design manager and make necessary changes.
6. Assemble final presentation material for printing as needed.
Other requirements:
1. Should know social media design.
2. Should have attention to detail (very important).
3. Must have knowledge of Photoshop, Figma/Adobe XD, Illustrator, Canva, After Effects, etc.
4. Must know how to use graphic design software and work with a wide variety of media.
5. Must have basic knowledge of layouts, typography, line composition, color, and other graphic design fundamentals.
6. Graphic design major preferred.
7. Experience with InDesign, Adobe Photoshop, and Illustrator.
8. Strong creative and analytical skills.
9. Compelling portfolio of graphic design work.

About Volumetree
6+ years of development experience in database technologies, preferably DB2, Sybase, or Oracle.
Python scripting, database/SQL, and Databricks/cloud experience.
Strong understanding of RDBMS databases, JDBC, and database access technologies (Sybase and DB2 preferred but not required).
Strong SQL and database programming skills, including creating views, stored procedures, and triggers, and implementing referential integrity.
Experience with database concepts, data analysis, database design, and data warehousing concepts, and working knowledge of dimensional modelling.
Excellent problem-solving skills and good communication skills.
Willingness to take the lead and own and drive things end to end independently.
Desired skills:
Knowledge of the financial domain.
Scripting languages like Perl and Python.
Prior experience working in an Agile model.
Experience: 4+ years.
Location: Vadodara & Pune
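The view, trigger, and referential-integrity skills listed above can be sketched with a small example. This is a hedged illustration only: it uses in-memory SQLite via Python's sqlite3 for portability (DB2/Sybase syntax differs, and SQLite has no stored procedures), and all table and column names are hypothetical.

```python
import sqlite3

# In-memory SQLite stands in for DB2/Sybase; the schema below is
# purely illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
PRAGMA foreign_keys = ON;

CREATE TABLE accounts (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);

-- Referential integrity: every trade must point at a real account.
CREATE TABLE trades (
    id         INTEGER PRIMARY KEY,
    account_id INTEGER NOT NULL REFERENCES accounts(id),
    amount     REAL NOT NULL,
    updated_at TEXT
);

-- A view aggregating per-account exposure.
CREATE VIEW account_exposure AS
SELECT account_id, SUM(amount) AS total_amount
FROM trades
GROUP BY account_id;

-- A trigger stamping each updated row (in DB2/Sybase this logic
-- could live in a stored procedure or trigger body instead).
CREATE TRIGGER trades_touch AFTER UPDATE OF amount ON trades
BEGIN
    UPDATE trades SET updated_at = datetime('now') WHERE id = NEW.id;
END;
""")

conn.execute("INSERT INTO accounts (name) VALUES ('desk-1')")
conn.execute("INSERT INTO trades (account_id, amount) VALUES (1, 100.0)")
conn.execute("INSERT INT" "O trades (account_id, amount) VALUES (1, 50.0)")
conn.execute("UPDATE trades SET amount = 120.0 WHERE id = 1")  # fires trigger

exposure = conn.execute("SELECT total_amount FROM account_exposure").fetchone()[0]
print(exposure)  # -> 170.0
```

The same shape (base tables with foreign keys, a reporting view, a housekeeping trigger) carries over to any RDBMS; only the procedural dialect changes.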
Skill set: Snowflake, Power BI, ETL, SQL, data pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion.
- Create and optimize ETL/ELT workflows using tools like DBT, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency by utilizing Snowflake's Query Profiling, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
What you need:
Basic Skills:
- 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams.
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, DBT, or custom scripts, to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
- Pipeline Monitoring and Optimization: Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
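The ingestion responsibilities above all hinge on one property: loading must be idempotent, so that replaying a Kafka batch or re-running Snowpipe never duplicates rows. A minimal sketch of that merge/upsert semantics in plain Python (not Snowflake's actual MERGE implementation; the `Event` fields and the in-memory "warehouse" are hypothetical stand-ins):

```python
from dataclasses import dataclass

@dataclass
class Event:
    id: str       # business key, analogous to the MERGE join key
    amount: float

def merge_into(target: dict, batch: list) -> dict:
    """Upsert semantics similar to a SQL MERGE: records with the same
    key overwrite earlier ones, so replaying a batch is idempotent."""
    for e in batch:
        target[e.id] = e.amount
    return target

warehouse = {}
batch = [Event("a", 10.0), Event("b", 5.0), Event("a", 12.0)]
merge_into(warehouse, batch)
merge_into(warehouse, batch)  # replay: no duplicates, same final state
print(warehouse)  # -> {'a': 12.0, 'b': 5.0}
```

In a real pipeline the same idea appears as `MERGE INTO target USING staged ON target.id = staged.id`, with Snowpipe or Kafka Connect landing records into the staging table first.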
Voiro is a media technology company founded in 2014 – a B2B SaaS offering that is trusted by some of India’s largest media powerhouses. A revenue analytics platform for content-led businesses, Voiro helps media companies unlock a data-driven approach to accelerating revenue. Over the last 6 years, Voiro has become the revenue management platform of choice in the Indian media space, earning its spot in the core technology stack that has driven large live events year after year, such as the IPL, the Oscars, Bigg Boss and Big Billion Day.
Job Description
We’re looking for a marketing leader to build a top-quality marketing practice; one
that will help scale Voiro to the next level on the back of a strong product, a
customer list that includes the best of Indian media, and a tight, close-knit team.
Your Responsibilities
As Voiro’s Vice President of Marketing, you will:
● Be responsible for building and strengthening Voiro’s brand and positioning
in line with the company’s product, vision and strategy
● Own and execute a communication strategy, both external and internal
● Be responsible for delivering a pipeline of conversations to our sales team
● Build a robust, data-driven, measurement-focused marketing practice,
including but not limited to product marketing, content marketing, events
and digital marketing
● Grow a team of focused, capable, marketing specialists
Experience and background
We’re looking for someone with 8 or more years of experience in B2B marketing,
working across at least two or three of the following verticals: brand marketing,
digital marketing and lead generation, product marketing, events, content
production, research and measurement. An MBA is preferred but not essential.
Our ideal hire will:
● Have experience leading a team
● Be a complete technology and media enthusiast
● Be familiar with SaaS business constructs
● Be driven by data and measurement
● Have unquestionable ethics and be incredibly self-motivated
Job description
Daily Doc Technologies LLC (https://dailydoc.io) was conceived to innovate and bring cutting-edge technology to medicine. Our mission is to make patient care more efficient and effortless and to minimise medical errors. We focus on bringing useful IT solutions to medicine.
With advancements in technology, communication in healthcare can be made seamless and effortless. Lack of effective communication is one of the main causes of medical errors and unwanted outcomes. The Daily Doc Healthcare App brings technology into today's complex medical environment to give healthcare providers the tools needed for effortless, reliable and secure communication. Designed by doctors and nurses, we strive to make our platform better every day. Honesty and integrity are our core values. We strive to innovate in healthcare to bring about positive, meaningful changes in people's lives.
Preferred Experience:
- 2+ years of experience working with mobile development.
- At least 1 to 2 years experience in Flutter Development.
- Knowledge of chat applications; experience with technologies like Socket.io and WebSockets is highly preferred.
- Deployed at least 3 complete apps with linked REST APIs.
- Cross-platform mobile app development experience, with familiarity with Flutter.
- Experience with Flutter for both iOS and Android; knowledge of native technologies is a bonus.
- Familiarity with linking RESTful APIs.
- Knowledge of modern authorisation mechanisms, such as JSON Web Tokens (JWT).
- Ability to understand business requirements and translate them into technical requirements.
- Preferred tech stack: Firebase Auth, Dart, BLoC, Cubit, MVC, Socket.io, WebSockets, Providers, network calls, web support, offline apps, local storage (e.g., sqflite), Google Maps API, and Google Material Design.
- Know how to deal with different screen sizes.
- Experience with version control such as Git and GitHub.
- Native Android skills such as Kotlin, XML, the Android lifecycle, crash-reporting tools, and usage-tracking tools are a bonus.
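The JWT requirement above comes down to a three-part base64url-encoded token (header, payload, signature). A minimal sketch of how a client can inspect one, written in Python for brevity since the posting's stack is Dart/Flutter; the token contents here are invented for illustration:

```python
import base64
import json

def b64url_decode(segment: str) -> bytes:
    """Decode a base64url segment, re-adding the stripped '=' padding."""
    return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))

def peek_jwt(token: str) -> dict:
    """Return the (unverified) payload of a JWT. A real app must verify
    the signature against the issuer's key before trusting any claim."""
    header_b64, payload_b64, _signature = token.split(".")
    return json.loads(b64url_decode(payload_b64))

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

# Build a toy unsigned token just to show the structure.
token = ".".join([
    b64url(json.dumps({"alg": "none", "typ": "JWT"}).encode()),
    b64url(json.dumps({"sub": "user-42", "role": "nurse"}).encode()),
    "",  # empty signature, acceptable only in this toy example
])
print(peek_jwt(token))  # -> {'sub': 'user-42', 'role': 'nurse'}
```

In a Flutter app the same decode/verify step would typically be delegated to a JWT package and the token stored securely, never hand-rolled as above.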
Responsibilities
- Design and build sophisticated and highly scalable apps using Flutter.
- Translate designs into high-quality responsive UI code.
- Write efficient queries for core data.
- Resolve any problems existing in the system and suggest and add new features in the complete system.
- Follow the best practices while developing the app.
- Document the project and code efficiently.
- Manage the code and project on Git in order to keep in sync with other team members and managers.
- Knowledge of different state management libraries like BLoC, GetX, etc.
About the company
Founded: 2018
Website: https://dailydoc.io
Total employees: 5
Job Types: Full-time, Permanent
Salary: ₹300,000.00 - ₹1,000,000.00 per year
Speak with the employer
96-99-56-97-85
At nFerence Labs, the "Google of Biomedicine", we are building the world's first massive-scale platform for pharmaco-biomedical computing. Our platform is premised on using AI/Deep Learning (on clinical text, medical images, and other signals) and massive high-performance computing to help pharma companies perform faster and more efficient drug discovery, and also help early diagnosis of several key diseases.
We collaborate heavily with premier medical institutions such as the Mayo Clinic and build systems to get deep medical insights from patient information including patient notes and lab information, medical images, ECGs, etc. We are a well-funded company and are looking to grow on all fronts.
We are hiring an experienced backend staff engineer for our Pramana team. Pramana, our digital-pathology-as-a-service venture, is an in-line quality assurance software suite which, for the first time in the industry, gives labs confidence in the accuracy and applicability of their digital assets while supporting industry-standard image formats.
Pramana’s whole-slide imaging system is built upon the strong hardware expertise of former Spectral Insights (which nference acquired in 2020) and the strong software expertise of nference. Modular systems with robotic automation have allowed Pramana to reduce its reliance on dedicated technical staff. This significantly reduces the total cost of ownership and is a more transparent model for Pramana’s clients.
Must have
- 5+ years of solid backend engineering experience in C++/Python
- Knowledge of data structures and an eye for architecture.
- Solid CS fundamentals, fluent in multi-threaded and asynchronous programming, and a strong inclination for architecting at scale.
- Excellent technical design, problem-solving, debugging, and communication skills.
- Rapid prototyping skills; has worked on distributed systems at scale.
- Basic knowledge of SQL as well as NoSQL databases.
- Proficient in Golang/Python, design, and concurrency patterns.
Good to have
- Proficient in writing unit tests and profiling and benchmarking Golang applications
- Experience in maintaining protobuf contracts
- Experience working with gRPC and grace (graceful restarts)
Benefits:
- Be a part of the “Google of biomedicine” as recognized by the Washington Post
- Work with some of the brilliant minds of the world, solving exciting real-world problems through Artificial Intelligence, Machine Learning, and analytics, triangulating unstructured and structured information from the biomedical literature as well as from large-scale molecular and real-world datasets.
- Our benefits package includes the best of what leading organizations provide, such as stock options, paid time off, healthcare insurance, and gym/broadband reimbursement.
• Two years of experience with Angular
• Angular 7+, Material, Typescript
• Expertise in chart libraries like D3.js, Chart.js, and Highcharts
• Jasmine, Karma
• JavaScript (ES6), CSS3, HTML 5 APIs
• AJAX with JSON
• RESTful Web services
• Full-stack development experience with backend Java or NodeJS is preferred.
• Familiar with object-oriented programming techniques and Agile development environment
• Experience in developing many different types of visualizations, including histogram analysis, time-series analysis methods
• Candidate should have an overall awareness of the performance and scalability of the screens created
• Working with Github
• Excellent communication skills, both verbal and written.
- Strong knowledge of PHP web frameworks like Laravel, and willingness to work on other skills
- Strong knowledge of MySQL and RDBMS
- Familiarity with Apache configuration
- Experience with object-oriented PHP programming
- Understanding of MVC design patterns
- User authentication and authorization between multiple systems, servers, and environments
- Strong knowledge of the common PHP or web server exploits and their solutions
- Creating database schemas that represent and support business processes.
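The MVC pattern required above separates data (model), presentation (view), and request handling (controller). A minimal language-agnostic sketch, written in Python rather than PHP purely for illustration; all class and method names are hypothetical, not from any framework:

```python
from typing import Optional

class UserModel:
    """Model: owns the data and persistence rules."""
    def __init__(self):
        self._users = {}

    def create(self, uid: str, name: str) -> None:
        self._users[uid] = name

    def get(self, uid: str) -> Optional[str]:
        return self._users.get(uid)

class UserView:
    """View: renders model data; knows nothing about storage."""
    @staticmethod
    def render(name: Optional[str]) -> str:
        return f"<h1>{name}</h1>" if name else "<h1>Not found</h1>"

class UserController:
    """Controller: turns a request into model calls and picks a view."""
    def __init__(self, model: UserModel):
        self.model = model

    def show(self, uid: str) -> str:
        return UserView.render(self.model.get(uid))

model = UserModel()
model.create("1", "Asha")
controller = UserController(model)
print(controller.show("1"))  # -> <h1>Asha</h1>
```

In Laravel the same roles map onto Eloquent models, Blade templates, and controller classes wired together by the router.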
About the company
It has set a benchmark in the medical and health industry with its revolutionary digital changes. It has had a huge impact on the country's education and health sectors, working to uplift and develop digital support for India's medical education through technology. Our products are designed and developed to benefit medical aspirants as well as the country's health education system. Through its continuous efforts, many medical institutions have successfully adopted a digitalised, advanced way of teaching and learning. Its MedWhiz LMS is highly effective and essential for medical aspirants.
Experience: 3-4 years
Responsibilities
● Develop new user-facing features.
● Build reusable code and libraries for future use.
● Ensure the technical feasibility of UI/UX designs.
● Optimize applications for maximum speed and scalability.
● Assure that all user input is validated before submitting to the back-end.
● Collaborate with other team members and stakeholders.
Skills
● Proficient understanding of web markup, including HTML5 and CSS3.
● Proficient understanding of client-side scripting and JavaScript frameworks.
● Proficient understanding of Bootstrap for making pages responsive.
● Good understanding of advanced JavaScript libraries such as React and the Angular 2+ framework.
● Good understanding of asynchronous request handling, partial page updates, and AJAX.
● Basic knowledge of image authoring tools, to be able to crop, resize, or perform small adjustments on an image. Familiarity with tools such as Photoshop is a plus.
● Proficient understanding of cross-browser compatibility issues and ways to work around them.
● Proficient understanding of code versioning tools, such as Git.
● Good understanding of SEO principles and ensuring that applications adhere to them.
● Proficient understanding of REST API integration.
Regards
Team Merito
Key Responsibilities:
- Design and build advanced applications for the Android platform
- Collaborate with cross-functional teams to define, design, and ship new features
- Work with outside data sources and APIs
- Unit-test code for robustness, including edge cases, usability, and general reliability
- Work on bug fixing and improving application performance
- Continuously discover, evaluate, and implement new technologies to maximize development efficiency
- Translate designs and wireframes into high quality code
- Understand business requirements and translate them into technical requirements
- Design, build, and maintain high performance, reusable, and reliable Java code
- Ensure the best possible performance, quality, and responsiveness of the application
- Identify and correct bottlenecks and fix bugs
- Help maintain code quality, organization, and automatization
- Unit-test code for robustness, including edge cases, usability, and general reliability, using JUnit, Mockito, or Espresso.
- Lead and Mentor Android developers
Required Skills and Experience :
- 6+ years of proven software development experience, including Android application development in Kotlin and React Native-based hybrid apps
- BS/MS degree in Computer Science, Engineering, or a related subject
- Experience with Android SDK, different versions of Android, and how to deal with different screen sizes
- Experience working with remote data via REST and JSON
- Experienced in Client server programming (RESTful APIs) to connect Android applications to backend services
- Working knowledge of the general mobile landscape, architectures, trends, and emerging technologies
- Solid understanding of the full mobile development life cycle.
- Strong knowledge of Android UI design principles, patterns, and best practices
- Experience in Android Studio IDE and tools like Android Device Monitor, Logcat
- Experience with offline storage, threading, and performance tuning
- Knowledge of the open-source Android ecosystem and the libraries available for common tasks
- Proficient in using code versioning tools, such as Git, SVN
- Strong CS fundamentals and a good working knowledge of algorithms and data structures.
- Strong on OOPS and Java concepts
- Proficient in integrating third-party libraries such as OkHttp, Retrofit, ButterKnife, and image-caching libraries
- Working knowledge of RxJava, RxAndroid
- Expert in debugging, troubleshooting, memory optimization, and the performance and scalability of mobile apps
- APK size and battery optimization
- Strong design/development experience, having built at least 4 mobile apps from scratch
- Has worked on MVP, MVVM design patterns for android applications
- Has experience on working with web views in Android and customizing them for different features
- Follows coding guidelines; reviews code for peers and juniors
- Understands and implements security guidelines
- Experience in localization
Intro
Our data and risk team is the core pillar of our business, harnessing alternative data sources to guide the decisions we make at Rely. The team designs, architects, develops, and maintains a scalable data platform that powers our machine learning models. Be part of a team that will help millions of consumers across Asia be effortlessly in control of their spending and make better decisions.
What will you do
The data engineer is focused on making data correct and accessible, and building scalable systems to access/process it. Another major responsibility is helping AI/ML Engineers write better code.
- Optimize and automate ingestion processes for a variety of data sources, such as clickstream, transactional, and many other sources.
- Create and maintain optimal data pipeline architecture and ETL processes
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Develop data pipeline and infrastructure to support real-time decisions
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS "big data" technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs.
What will you need
- 2+ years of hands-on experience building and implementing large-scale production pipelines and data warehouses
- Experience dealing with large-scale data
- Proficiency in writing and debugging complex SQL
- Experience working with AWS big data tools
- Ability to lead projects and implement best data practices and technology
Data Pipelining
- Strong command of building & optimizing data pipelines, architectures, and data sets
- Strong command of relational SQL & NoSQL databases, including Postgres
- Data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
Big Data: Strong experience in big data tools & applications
- Tools: Hadoop, Spark, HDFS, etc.
- AWS cloud services: EC2, EMR, RDS, Redshift
- Stream-processing systems: Storm, Spark Streaming, Flink, etc.
- Message queuing: RabbitMQ, Kafka, etc.
Software Development & Debugging
- Strong experience in object-oriented programming and scripting languages: Python, Java, C++, Scala, etc.
- Strong grasp of data structures & algorithms
What would be a bonus
- Prior experience working in a fast-growth Startup
- Prior experience in payments, fraud, lending, or advertising companies dealing with large-scale data
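The stream-processing tools listed above (Storm, Spark Streaming, Flink) mostly compute windowed aggregations over event streams. A minimal pure-Python sketch of a tumbling-window count over a clickstream, purely to illustrate the concept; the event fields and window size are hypothetical:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Group (timestamp, page) click events into fixed, non-overlapping
    (tumbling) windows and count events per (window_start, page) --
    the kind of aggregation a Spark Streaming or Flink job runs
    continuously over a clickstream."""
    counts = defaultdict(int)
    for ts, page in events:
        window_start = (ts // window_secs) * window_secs
        counts[(window_start, page)] += 1
    return dict(counts)

# Four events: three in the first minute-window, two in the second.
events = [(5, "home"), (30, "home"), (65, "cart"), (70, "home")]
print(tumbling_window_counts(events))
# -> {(0, 'home'): 2, (60, 'cart'): 1, (60, 'home'): 1}
```

Real engines add the hard parts this sketch omits: out-of-order events, watermarks, and incremental state checkpointing.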