
Job Summary:
We are seeking 3D Modelling Interns to join our asset production team, where they will work closely with the team to create 3D models of retail products. The ideal candidate has a strong background in 3D modelling software, a keen eye for detail, and the ability to work independently.
Key Responsibilities:
- Collaborate with production team members to create accurate 3D models of retail products.
- Use 3D modelling software, preferably Blender 3D, to create, modify, and refine 3D models.
- Ensure that the 3D models are accurate, detailed, and realistic.
- Test and evaluate 3D models to ensure they meet design and asset production requirements.
- Work as part of a production team to ensure that projects are completed on time.
- Communicate progress and issues effectively with the asset production teams.
- Participate in reviews and provide feedback to improve the asset production process.
- Continuously learn and improve 3D modelling skills and techniques.
Requirements:
- Currently enrolled in a degree or certificate program in a relevant field such as Industrial Design, Product Design, or 3D Modelling.
- Strong knowledge of 3D modelling in Blender or similar modelling software. If not proficient in Blender, training will be provided.
- UV mapping and texturing skills are highly desirable (see the Blender scripting sketch after this list).
- Knowledge of PBR texture workflow is preferred but not required.
- Excellent attention to detail and ability to work independently and as part of a team.
- Strong communication and interpersonal skills.
- Ability to meet tight deadlines.
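
For a sense of how UV work is often scripted, Blender exposes its operators through Python (bpy). The sketch below is illustrative only, not part of the role's stated toolchain: it smart-unwraps the active mesh object and assumes it is run from Blender's scripting workspace with a mesh selected.

    # Minimal Blender (bpy) sketch: smart-UV-unwrap the active mesh object.
    # Assumes it runs inside Blender with a mesh object selected.
    import bpy

    obj = bpy.context.active_object
    assert obj is not None and obj.type == "MESH", "select a mesh object first"

    bpy.ops.object.mode_set(mode="EDIT")        # UV operators work in Edit Mode
    bpy.ops.mesh.select_all(action="SELECT")    # unwrap every face
    bpy.ops.uv.smart_project(angle_limit=1.15)  # angle limit in radians (~66 degrees)
    bpy.ops.object.mode_set(mode="OBJECT")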
These internships are paid and offer a great opportunity to gain hands-on experience working on the future of retail product visualisation across industries. The successful candidates will work with a team of experienced professionals and gain valuable insights into the asset production process.


Eridium Digital is a digital marketing company: we help brands see, shape, and act on opportunities. Driven by insights and data, we decipher consumer intent, delivering integrated brand presence and targeted campaigns across Search, Social, and AI-driven Display, in sync with marketing technologies. We help brands enhance their value across reputation, reach, and interest.
As a Content Writer for Eridium Digital, you will create various forms of content for our clients, ranging from IT, healthcare, and finance to NGOs and more. Follow editorial guidelines, play with words, and let your content reflect the true value of products and services. Research before writing is a key part of the process.
Responsibilities:
- Create metadata for clients.
- Repurpose existing content and create new content for technical and B2B audiences.
- Conduct in-depth research on industry-related topics to develop original content.
- Create keyword-rich Q&A content for Quora, Reddit, Web 2.0 platforms, etc.
- Develop blogs, articles, website content, etc. with suggested keywords.
- Create compelling pieces with information and keywords to enhance search engine visibility.
- Create Ad campaign content for clients.
- Suggest blog topics based on keywords for clients.
Requirements:
- Minimum of 2-3 years' experience in content writing.
- Prior experience with B2B and tech companies, and with content marketing.
- Working knowledge of SEO and Technical content writing.
- The ability to work in a fast-paced environment.
- The ability to handle multiple projects concurrently.
- Effective communication skills.


Node.js Developer Responsibilities:
- Developing and maintaining all server-side network components.
- Ensuring optimal performance of the central database and responsiveness to front-end requests.
- Collaborating with front-end developers on the integration of elements.
- Designing customer-facing UI and back-end services for various business processes.
- Developing high-performance applications by writing testable, reusable, and efficient code.
- Implementing effective security protocols, data protection measures, and storage solutions.
- Running diagnostic tests, repairing defects, and providing technical support.
- Documenting Node.js processes, including database schemas, as well as preparing reports.
- Recommending and implementing improvements to processes and technologies.
- Keeping informed of advancements in the field of Node.js development.
Node.js Developer Requirements:
- Bachelor's degree in computer science, information science, or similar.
- At least two years' experience as a Node.js developer.
- Extensive knowledge of JavaScript, web stacks, libraries, and frameworks.


About KAFQA
At Kafqa, we are building the next generation performing arts platform. Our mission is to transform how India learns, performs & watches performing arts. Our launch services consist of technology-enabled dance classes in our proprietary studios, production facilities, and social media broadcasting & competitions.
Founder & Team
The founder is Shariq Plasticwala. He is a graduate of IIT Bombay & Stanford GSB. He was part of the founding team of Amazon India, where he played a key role for over 8 years. Among his roles at Amazon, he was the CEO of Amazon’s first joint venture in India and a Board Member of Amazon’s payments business. The rest of the founding team consists of senior executives from Shiamak Davar & Byju’s.
Background
- 6+ years' experience in back-end development in a fast-paced environment, working with Python & the Django framework.
- Experience leading a team.
- Deep understanding of technologies used in web products, including Rest APIs
- Sound understanding of SQL/NoSQL databases such as PostgreSQL and MongoDB.
- Deep familiarity with UNIX, major cloud platforms (AWS, Azure), DevOps.
- Understanding of databases and related tools and paradigms.
Opportunity
Here are some of the things you might have to do on any given day:
- Developing APIs and endpoints for deployments of our product (see the sketch after this list).
- Infrastructure Development such as building databases, creating and maintaining automated jobs.
- Build out the back-end to deploy and scale our product.
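
To give a flavour of the API work above, here is a minimal sketch assuming a Django REST Framework setup; the DanceClass model and the classes/ route are hypothetical illustrations, not KAFQA's actual code, and the snippet assumes it lives inside a configured Django app with DRF installed.

    # Minimal Django REST Framework sketch (hypothetical model and route).
    # Assumes this code sits in a configured Django app with DRF installed.
    from django.db import models
    from rest_framework import routers, serializers, viewsets

    class DanceClass(models.Model):
        # Hypothetical model: one scheduled class in a studio.
        name = models.CharField(max_length=100)
        starts_at = models.DateTimeField()

    class DanceClassSerializer(serializers.ModelSerializer):
        class Meta:
            model = DanceClass
            fields = ["id", "name", "starts_at"]

    class DanceClassViewSet(viewsets.ModelViewSet):
        # One class exposes list/create/retrieve/update/delete endpoints.
        queryset = DanceClass.objects.all()
        serializer_class = DanceClassSerializer

    router = routers.DefaultRouter()
    router.register(r"classes", DanceClassViewSet)
    # urlpatterns = router.urls  # wired into the project's urls.py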


The candidate will handle the entire recruitment and selection cycle under the guidance of the Head - HR, including but not limited to:
- Sourcing and screening candidates from job portals and other forums.
- Acting as a point of contact and building influential candidate relationships during the selection process.
- Must have good communication skills.
- Freshers may apply.


* Understanding of building architecture from product requirements.
* Experience of leading teams of developers to maximise performance.
* Knowledge of Multithreading, Thread Pooling, Background Jobs and Schedule Jobs with supporting tools and libraries.
* Working on microservices-based architecture using Spring Cloud, distributed application patterns, and multiple data source management in the application.
* Working on Linux and Windows based OS and their command line tools.
* Working with unit testing frameworks.
* Object-Oriented development and Metaprogramming.
* Experience working with SQL databases (MySQL or PostgreSQL) and NoSQL databases (Cassandra or MongoDB).
* Knowledge of server configuration management and deployment techniques. Good to have experience with DevOps tools like Jenkins and containerization using Docker.
* Experience in working with different AWS cloud services.


• Design and code excellent workflows, features, and modules in the Simplify360 suite.
• Tackle challenging engineering and product problems, and create solutions to customers' problems.
• Create new ideas with our design teams to continually iterate on the experience.
• Work cross-functionally to evaluate the relative importance of and need for product initiatives.
• Take ownership of modules from design to implementation and deployment.
Requirements
• Great software design and development skills. Deep knowledge of design, coding, and implementation.
• Ability to work both independently and in cooperation with others.
• A sense of urgency and ownership over the product.
• Comfortable with full-stack projects and able to build minimum working prototypes quickly.
• Fluency with both front-end (e.g., HTML/CSS/JavaScript, Bootstrap, jQuery) and back-end technologies, primarily Core Java, J2EE, Struts, and Hibernate.
• Knowledge of Solr, Kafka would be an added advantage.
• Knowledge of Big Data solutions like Hadoop, HBase would be an added advantage.
• Great attitude towards work and people.


About the Company:
It is a Data as a Service company that helps businesses harness the power of data. Our technology fuels some of the world's most interesting big data projects. We are a small bunch of people working towards shaping the imminent data-driven future by solving some of its fundamental and toughest challenges.
Role: We are looking for an experienced team lead to drive data acquisition projects end to end. In this role, you will be working in the web scraping team with data engineers, helping them solve complex web problems and mentor them along the way. You’ll be adept at delivering large-scale web crawling projects, breaking down barriers for your team and planning at a higher level, and getting into the detail to make things happen when needed.
Responsibilities
- Interface with clients and the sales team to translate functional requirements into technical requirements
- Plan and estimate tasks with your team, in collaboration with the delivery managers
- Engineer complex data acquisition projects
- Guide and mentor your team of engineers
- Anticipate issues that might arise and proactively factor them into the design
- Perform code reviews and suggest design changes
Prerequisites
- 2-3 years of relevant experience
- Fluent programming skills and well-versed with scripting languages like Python or Ruby
- Solid foundation in data structures and algorithms
- Excellent tech troubleshooting skills
- Good understanding of web data landscape
- Prior exposure to DOM and XPath, and hands-on experience with Selenium/automated testing, is a plus (see the sketch after this list)
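
As a rough illustration of the DOM/XPath skills mentioned in the last prerequisite, below is a minimal Selenium sketch in Python. The URL and XPath expression are placeholders invented for illustration, and a local Chrome/chromedriver install is assumed.

    # Minimal Selenium sketch: locate elements by XPath and read their text.
    # The URL and XPath are placeholders; assumes Chrome + chromedriver installed.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()
    try:
        driver.get("https://example.com/products")
        # XPath targets each product title inside a listing card.
        titles = driver.find_elements(By.XPATH, "//div[@class='product']//h2")
        for title in titles:
            print(title.text)
    finally:
        driver.quit()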


Responsible for planning, connecting, designing, scheduling, and deploying data warehouse systems. Develops, monitors, and maintains ETL processes, reporting applications, and data warehouse design.
Role and Responsibility
· Plan, create, coordinate, and deploy data warehouses.
· Design the end-user interface.
· Create best practices for data loading and extraction.
· Develop data architecture, data modeling, and ETL mapping solutions within a structured data warehouse environment.
· Develop reporting applications and maintain data warehouse consistency.
· Facilitate requirements gathering using expert listening skills and develop simple solutions that meet the immediate and long-term needs of business customers.
· Supervise design throughout the implementation process.
· Design and build cubes while performing custom scripts.
· Develop and implement ETL routines according to the DWH design and architecture (see the sketch below).
· Support the development and validation required through the lifecycle of the DWH and Business Intelligence systems, maintain user connectivity, and provide adequate security for the data warehouse.
· Monitor DWH and BI system performance and integrity; provide corrective and preventative maintenance as required.
· Manage multiple projects at once.
DESIRABLE SKILL SET
· Experience with technologies such as MySQL, MongoDB, and SQL Server 2008, as well as newer ones like SSIS and stored procedures.
· Exceptional experience developing code, testing for quality assurance, administering RDBMSs, and monitoring databases.
· High proficiency in dimensional modeling techniques and their applications.
· Strong analytical, consultative, and communication skills, as well as the ability to exercise good judgment and work with both technical and business personnel.
· Several years' working experience with Tableau, MicroStrategy, Information Builders, and other reporting and analytical tools.
· Working knowledge of SAS and R code used in data processing and modeling tasks.
· Strong experience with Hadoop, Impala, Pig, Hive, YARN, and other "big data" technologies such as AWS Redshift or Google BigQuery.
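
The ETL responsibilities above can be pictured with a minimal, hypothetical routine: extract a flat file, apply light transformations, and load the result into a staging table. Every file, table, and column name below is invented for illustration, and a pandas + SQLAlchemy stack is an assumption, not something prescribed by the posting.

    # Minimal ETL sketch (hypothetical names throughout).
    import pandas as pd
    from sqlalchemy import create_engine

    def run_etl(csv_path: str, db_url: str) -> None:
        # Extract: read raw source data.
        raw = pd.read_csv(csv_path)

        # Transform: normalise column names, drop duplicates, parse dates.
        raw.columns = [c.strip().lower().replace(" ", "_") for c in raw.columns]
        cleaned = raw.drop_duplicates()
        if "order_date" in cleaned.columns:
            cleaned["order_date"] = pd.to_datetime(cleaned["order_date"], errors="coerce")

        # Load: append into a staging table; downstream jobs promote it to the warehouse.
        engine = create_engine(db_url)
        cleaned.to_sql("stg_orders", engine, if_exists="append", index=False)

    run_etl("orders.csv", "postgresql://user:pass@localhost/dwh")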

