We are looking for a creative individual to join our team as a 3D artist.
The responsibilities of the 3D artist include creating still and moving images using computers,
creating 3D models of products, and managing multiple projects while adhering to deadlines.
Ultimately, a top-notch 3D artist is creative and artistic with a strong working knowledge of color,
texture, and light as well as industry-standard software. Familiarity with the Metaverse is a plus.
Responsibilities:
* Using 3D modeling, texture mapping, and other techniques to create graphics, visual effects, and animations.
* Collaborating with Animators and other artists and attending meetings to discuss ongoing projects.
* Understanding the project requirements and conceptualizing creative ideas.
* Creating 3D sculpts and assets to meet artistic standards.
* Troubleshooting any problems that arise during work on a project.
* Meeting with clients, designers, and directors to discuss and review projects and deadlines.
Software Requirements:
* Maya
* Blender
* Clo
* Mixamo
* 3ds Max

About Sankshit PVT LTD

Role Overview:
We are seeking a talented and experienced Data Architect with strong data visualization capabilities to join our dynamic team in Mumbai. As a Data Architect, you will be responsible for designing, building, and managing our data infrastructure, ensuring its reliability, scalability, and performance. You will also play a crucial role in transforming complex data into insightful visualizations that drive business decisions. This role requires a deep understanding of data modeling, database technologies (particularly Oracle Cloud), data warehousing principles, and proficiency in data manipulation and visualization tools, including Python and SQL.
Responsibilities:
- Design and implement robust and scalable data architectures, including data warehouses, data lakes, and operational data stores, primarily leveraging Oracle Cloud services.
- Develop and maintain data models (conceptual, logical, and physical) that align with business requirements and ensure data integrity and consistency.
- Define data governance policies and procedures to ensure data quality, security, and compliance.
- Collaborate with data engineers to build and optimize ETL/ELT pipelines for efficient data ingestion, transformation, and loading.
- Develop and execute data migration strategies to Oracle Cloud.
- Utilize strong SQL skills to query, manipulate, and analyze large datasets from various sources.
- Leverage Python and relevant libraries (e.g., Pandas, NumPy) for data cleaning, transformation, and analysis.
- Design and develop interactive and insightful data visualizations using tools such as Tableau, Power BI, Matplotlib, Seaborn, or Plotly to communicate data-driven insights to both technical and non-technical stakeholders.
- Work closely with business analysts and stakeholders to understand their data needs and translate them into effective data models and visualizations.
- Ensure the performance and reliability of data visualization dashboards and reports.
- Stay up-to-date with the latest trends and technologies in data architecture, cloud computing (especially Oracle Cloud), and data visualization.
- Troubleshoot data-related issues and provide timely resolutions.
- Document data architectures, data flows, and data visualization solutions.
- Participate in the evaluation and selection of new data technologies and tools.
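The data cleaning and transformation work described in the responsibilities above might look like this minimal pandas sketch. The dataset, column names, and cleaning rules are all hypothetical, chosen only to illustrate the kind of task, not taken from the posting:

```python
import pandas as pd
import numpy as np

# Hypothetical raw sales extract with the kinds of issues a cleaning
# pass typically handles: inconsistent casing, missing values, and
# duplicate rows.
raw = pd.DataFrame({
    "region": ["North", "North", "south", None, "South"],
    "revenue": [1200.0, 1200.0, np.nan, 900.0, 450.0],
})

clean = (
    raw
    .drop_duplicates()                          # remove exact duplicate rows
    .dropna(subset=["region"])                  # drop rows with no region
    .assign(
        region=lambda d: d["region"].str.title(),   # normalize casing
        revenue=lambda d: d["revenue"].fillna(0.0), # impute missing revenue
    )
)

# Roll up cleaned data into the kind of summary a dashboard would consume.
summary = clean.groupby("region", as_index=False)["revenue"].sum()
```

A summary frame like this would then typically feed a visualization layer (e.g., a Power BI dataset or a Matplotlib chart).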
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field.
- Proven experience (typically 5+ years) as a Data Architect, Data Modeler, or similar role.
- Deep understanding of data warehousing concepts, dimensional modeling (e.g., star schema, snowflake schema), and ETL/ELT processes.
- Extensive experience working with relational databases, particularly Oracle, and proficiency in SQL.
- Hands-on experience with Oracle Cloud data services (e.g., Autonomous Data Warehouse, Object Storage, Data Integration).
- Strong programming skills in Python and experience with data manipulation and analysis libraries (e.g., Pandas, NumPy).
- Demonstrated ability to create compelling and effective data visualizations using industry-standard tools (e.g., Tableau, Power BI, Matplotlib, Seaborn, Plotly).
- Excellent analytical and problem-solving skills with the ability to interpret complex data and translate it into actionable insights.
- Strong communication and presentation skills, with the ability to effectively communicate technical concepts to non-technical audiences.
- Experience with data governance and data quality principles.
- Familiarity with agile development methodologies.
- Ability to work independently and collaboratively within a team environment.
Application Link- https://forms.gle/km7n2WipJhC2Lj2r5

Responsibilities:
· Design, develop, and implement AI/ML models and algorithms.
· Focus on building Proof of Concept (POC) applications to demonstrate the feasibility and value of AI solutions.
· Write clean, efficient, and well-documented code.
· Collaborate with data engineers to ensure data quality and availability for model training and evaluation.
· Work closely with senior team members to understand project requirements and contribute to technical solutions.
· Troubleshoot and debug AI/ML models and applications.
· Stay up-to-date with the latest advancements in AI/ML.
· Utilize machine learning frameworks (e.g., TensorFlow, PyTorch, Scikit-learn) to develop and deploy models.
· Develop and deploy AI solutions on Google Cloud Platform (GCP).
· Implement data preprocessing and feature engineering techniques using libraries like Pandas and NumPy.
· Utilize Vertex AI for model training, deployment, and management.
· Integrate and leverage Google Gemini for specific AI functionalities.
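A Proof of Concept of the kind listed above often starts as a small scikit-learn baseline before any cloud deployment. The sketch below uses the built-in Iris dataset as a stand-in for real project data; the model choice and split parameters are illustrative assumptions:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Load a toy dataset standing in for real project data.
X, y = load_iris(return_X_y=True)

# Hold out a stratified test split so the POC metric is meaningful.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y)

# Train a simple, explainable baseline model.
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Report a single headline metric for the POC.
acc = accuracy_score(y_test, model.predict(X_test))
```

Once a baseline like this demonstrates feasibility, the same workflow would typically be ported to Vertex AI for managed training and deployment.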
Qualifications:
· Bachelor’s degree in Computer Science, Artificial Intelligence, or a related field.
· 3+ years of experience in developing and implementing AI/ML models.
· Strong programming skills in Python.
· Experience with machine learning frameworks such as TensorFlow, PyTorch, or Scikit-learn.
· Good understanding of machine learning concepts and techniques.
· Ability to work independently and as part of a team.
· Strong problem-solving skills.
· Good communication skills.
· Experience with Google Cloud Platform (GCP) is preferred.
· Familiarity with Vertex AI is a plus.
AccioJob is conducting an offline hiring drive with Fintech for the position of Backend Developer Intern.
Required Skills - NodeJS
Apply Here - https://go.acciojob.com/hwPWZx
Eligibility -
- Degree: B.Tech/BE/MCA
- Branch: All Branches
- Work Location: Bangalore
Compensation -
- Internship stipend: 50k
- Internship duration: 6 months
- CTC: 10 LPA
Evaluation Process -
- Round 1: Offline Assessment at Atria Institute of Technology, Bangalore
- Further Rounds (for shortlisted candidates only):
- Profile & Background Screening Round,
- Assignment,
- Technical Interview Round 1,
- Techno Managerial Round
Important Note: Bring your laptop & earphones for the test.
Register Here: https://go.acciojob.com/hwPWZx
· Generating leads and developing new business for your assigned territory.
· Educating clients about new media and convincing them to advertise their products using eco-friendly media.
· Scheduling meetings with prospective clients and driving them toward generating business.
· Handling the sales process end to end, from pitching to client retention.
· Staying well aware of retail and corporate clients in the assigned territory.
· Developing a thorough understanding of key clients' needs and requirements and preparing customized solutions.
· Negotiating contracts with clients and meeting established deadlines for the fulfillment of each client's long-term goals.
· Playing an integral role in the effective onboarding of new clients.
· Dealing with clients and efficiently handling their queries.
· Serving as the contact point for key customers and internal teams.
· Suggesting solutions that answer client needs and wants.
· Demonstrating good communication, presentation, selling, and deal-closing skills.
· Driving business for the company and retaining clients will be your core expertise.
Roles and Responsibilities
We are seeking an AWS Cloud Engineer / Data Warehouse Developer for our Data CoE team to help us configure and develop new AWS environments for our Enterprise Data Lake and migrate on-premise traditional workloads to the cloud. The candidate must have a sound understanding of BI best practices, relational structures, dimensional data modelling, structured query language (SQL) skills, and data warehouse and reporting techniques.
* Extensive experience in providing AWS Cloud solutions to various business use cases.
* Creating star schema data models, performing ETLs, and validating results with business representatives.
* Supporting implemented BI solutions by monitoring and tuning queries and data loads, addressing user questions concerning data integrity, monitoring performance, and communicating functional and technical issues.
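The star-schema modelling, ETL, and validation work mentioned above can be sketched end to end with Python's built-in sqlite3 module standing in for a warehouse such as Redshift. All table and column names here are illustrative assumptions:

```python
import sqlite3

# One fact table keyed to two dimension tables: a minimal star schema.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount      REAL
);
""")

# Toy ETL step: load dimensions first, then facts.
cur.executemany("INSERT INTO dim_date VALUES (?, ?)",
                [(20240101, "2024-01-01"), (20240102, "2024-01-02")])
cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "Widget"), (2, "Gadget")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(20240101, 1, 100.0), (20240101, 2, 50.0),
                 (20240102, 1, 25.0)])
con.commit()

# Validation: the grand total must reconcile with the per-product rollup.
total = cur.execute("SELECT SUM(amount) FROM fact_sales").fetchone()[0]
rollup = cur.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
```

In a real engagement the reconciliation query would be run against source-system totals with business representatives, not against the warehouse's own sum.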
Job Description:
This position is responsible for the successful delivery of business intelligence information to the entire organization and requires experience in BI development and implementations, data architecture, and data warehousing.
Requisite Qualification
Essential: AWS Certified Database Specialty or AWS Certified Data Analytics
Preferred: Any other Data Engineer certification
Requisite Experience
Essential: 4-7 yrs of experience
Preferred: 2+ yrs of experience in ETL & data pipelines
Skills Required
* AWS: S3, DMS, Redshift, EC2, VPC, Lambda, Delta Lake, CloudWatch, etc.
* Big data: Databricks, Spark, Glue, and Athena
* Expertise in Lake Formation, Python programming, Spark, and shell scripting
* Minimum Bachelor’s degree with 5+ years of experience in designing, building, and maintaining AWS data components
* 3+ years of experience in data component configuration, related roles, and access setup
* Expertise in Python programming
* Knowledge of all aspects of DevOps (source control, continuous integration, deployments, etc.)
* Comfortable working with DevOps tools: Jenkins, Bitbucket, CI/CD
* Hands-on ETL development experience, preferably using SSIS
* SQL Server experience required
* Strong analytical skills to solve and model complex business requirements
* Sound understanding of BI best practices/methodologies, relational structures, dimensional data modelling, structured query language (SQL) skills, and data warehouse and reporting techniques
Preferred Skills
* Experience working in a SCRUM environment.
* Experience in Administration (Windows/Unix/Network) is a plus.
* Experience in SQL Server, SSIS, SSAS, SSRS
* Comfortable creating data models and visualizations using Power BI
* Hands-on experience in relational and multi-dimensional data modelling, including multiple source systems from databases and flat files, and the use of standard data modelling tools
* Ability to collaborate on a team with infrastructure, BI report development, and business analyst resources, and clearly communicate solutions to both technical and non-technical team members

Apps published on Google Play or available on GitHub
3+ years in mobile software development
Proficiency in Java, Kotlin, or C++
Ability to use Android Studio, including the Android SDK, with ease
Collaborating with UI and UX designers, as well as software testers, to ensure that each app is presentable and in perfect working order
Experience with third-party libraries and APIs
Experience with automated testing and builds
Experience with Git or other version control tools, and CI tools such as Jenkins
Ability to write readable code, create extensive documentation for existing code, and refactor previously written code into a readable state
Intermediate English skills
BONUS, IF YOU ALREADY KNOW: Flutter or React Native

Job type: Permanent with ACL Digital
Client: Concentrix
Required skills:
- MySQL or Postgres
- Linux systems
- Agile Methodologies
- Good communication skills
1. Make proposals for clients
2. Negotiate with clients
3. Coordinate with the team on projects
4. Generate new business opportunities
5. Networking, marketing, and calling
6. Introduce and market the company

