
Job Title: Technology Lead – Frontend
Job Responsibilities:
- Work collaboratively with the UX team, product managers, back-end engineers, and other technical teams to create new, delightful, and scalable UIs
- Mentor junior members through training and individual support
- Own all front-end development on a project
- Ensure the technical feasibility of UI/UX designs
- Optimize application for maximum speed and scalability
- Identify and troubleshoot UX problems (e.g. responsiveness)
- Conduct layout adjustments based on user feedback
- Adhere to style standards on fonts, colors and images
- Optimize existing user interface designs
- Proactively learn and accordingly apply new technologies, best design practices and usability patterns
The ideal candidate must have -
- 5+ years of experience working with React or React-based frameworks
- Proficiency with browser-based debugging and performance testing
- Proficiency in Git and version control
- Considerable experience with a testing framework (Jest/Mocha)
- Familiarity with RESTful APIs and GraphQL
- Knowledge of modern authorization mechanisms such as JSON Web Tokens (JWT) (see the sketch after this list)
- Good understanding of common front-end development tools such as Babel, Webpack, npm, and Yarn
- Hands-on experience with CSS-in-JS libraries
- Proficient understanding of cross-browser compatibility issues and ways to work around them
- Keenness to learn and mentor, and willingness to take ownership of work items
- Strong communication skills and self-accountability
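
For illustration only, here is a minimal sketch of issuing and verifying a JSON Web Token in Python with the PyJWT library; the secret key, claims, and HS256 algorithm are assumptions made for the example, not requirements taken from this posting.

```python
# Minimal JWT issue/verify sketch using PyJWT (pip install PyJWT).
# The signing secret, claims, and HS256 algorithm are illustrative choices.
import datetime
import jwt

SECRET = "change-me"  # hypothetical signing secret for the example

def issue_token(user_id: str) -> str:
    payload = {
        "sub": user_id,
        "exp": datetime.datetime.now(datetime.timezone.utc)
               + datetime.timedelta(hours=1),
    }
    return jwt.encode(payload, SECRET, algorithm="HS256")

def verify_token(token: str) -> dict:
    # Raises jwt.ExpiredSignatureError / jwt.InvalidTokenError on failure.
    return jwt.decode(token, SECRET, algorithms=["HS256"])

if __name__ == "__main__":
    token = issue_token("user-42")
    print(verify_token(token)["sub"])  # -> user-42
```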
Skills
REACT.JS

About the Role:
We are seeking a talented Lead Data Engineer to join our team and play a pivotal role in transforming raw data into valuable insights. As a Data Engineer, you will design, develop, and maintain robust data pipelines and infrastructure to support our organization's analytics and decision-making processes.
Responsibilities:
- Data Pipeline Development: Build and maintain scalable data pipelines to extract, transform, and load (ETL) data from various sources (e.g., databases, APIs, files) into data warehouses or data lakes (a minimal sketch follows this list).
- Data Infrastructure: Design, implement, and manage data infrastructure components, including data warehouses, data lakes, and data marts.
- Data Quality: Ensure data quality by implementing data validation, cleansing, and standardization processes.
- Team Management: Lead and manage the data engineering team.
- Performance Optimization: Optimize data pipelines and infrastructure for performance and efficiency.
- Collaboration: Collaborate with data analysts, scientists, and business stakeholders to understand their data needs and translate them into technical requirements.
- Tool and Technology Selection: Evaluate and select appropriate data engineering tools and technologies (e.g., SQL, Python, Spark, Hadoop, cloud platforms).
- Documentation: Create and maintain clear and comprehensive documentation for data pipelines, infrastructure, and processes.
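
As a concrete illustration of the extract-transform-load flow described above, here is a minimal Python sketch; the CSV source, SQLite target, schema, and validation rule are all assumptions made for the example.

```python
# Minimal ETL sketch: extract from a CSV source, apply a small
# cleansing/validation transform, and load into a warehouse-style table.
# File name, schema, and the validation rule are illustrative assumptions.
import csv
import sqlite3

def extract(path: str):
    # Extract: stream rows from a CSV source as dictionaries.
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    # Transform: standardize and validate (a stand-in for real data-quality rules).
    for row in rows:
        email = row.get("email", "").strip().lower()
        if "@" not in email:
            continue  # drop records that fail validation
        yield (row["id"], email)

def load(records, conn: sqlite3.Connection):
    # Load: upsert the cleaned records into the target table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customers (id TEXT PRIMARY KEY, email TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO customers VALUES (?, ?)", records)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")
    load(transform(extract("customers.csv")), conn)
```

In practice the same shape scales up: the extract step would read from production databases or APIs, and the load step would target a warehouse such as Snowflake or Redshift.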
Skills:
- Strong proficiency in SQL and at least one programming language (e.g., Python, Java).
- Experience with data warehousing and data lake technologies (e.g., Snowflake, AWS Redshift, Databricks).
- Knowledge of cloud platforms (e.g., AWS, GCP, Azure) and cloud-based data services.
- Understanding of data modeling and data architecture concepts.
- Experience with ETL/ELT tools and frameworks.
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
Preferred Qualifications:
- Experience with real-time data processing and streaming technologies (e.g., Kafka, Flink).
- Knowledge of machine learning and artificial intelligence concepts.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Certification in cloud platforms or data engineering.
1. Collaborate with the team to develop design concepts and visual solutions.
2. Work with our draping software to create virtual draping experiences for our customers.
3. Assist in creating textile patterns and designs for our products.
4. Assist in video editing and video-making tasks to produce engaging promotional content for our brand.
5. Apply your knowledge of digital marketing and Instagram marketing to strategize and execute effective social media campaigns that drive brand awareness and engagement.
6. Embrace design thinking principles to solve complex creative challenges and contribute innovative ideas to our design process.
7. Utilize Canva to develop aesthetically pleasing and visually appealing graphics for our online platforms and marketing materials.


Role: Application Architect
Job Description:
Key Responsibilities:
- Understand functional and non-functional requirements, and devise an end-to-end approach and strategy to modernize applications
- Plan, design, and develop architectures for highly available, scalable, and secure enterprise systems
- Ideate and drive automation; lead points of view (PoVs) and proofs of concept (PoCs), and lead the development of minimum viable products
- Identify risks and issues, and work on a mitigation plan
- Review enterprise application portfolios and provide application rationalization and modernization strategies
- Provide functional and/or technical expertise to plan, analyze, define, and support the delivery of future functional and technical capabilities for an application or group of applications.
- Assist in facilitating impact assessment efforts and in producing and reviewing estimates for work requests.
Technical Experience:
- Highly experienced in application architecture, including cloud-native, microservices, and serverless architectures
- Understanding of domain-driven design and event-driven architecture (see the sketch after this list)
- In-depth experience developing applications with PHP or Python and with databases such as PostgreSQL and MySQL
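
As a rough illustration of the event-driven style mentioned above, here is a minimal in-process event-bus sketch in Python; real systems would typically use a message broker, and the event names and handlers are assumptions made for the example.

```python
# Minimal in-process event-bus sketch illustrating event-driven design:
# publishers emit events by name; subscribers react independently.
# Event names and handlers here are illustrative assumptions.
from collections import defaultdict
from typing import Any, Callable

class EventBus:
    def __init__(self) -> None:
        self._handlers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, event: str, handler: Callable[[Any], None]) -> None:
        self._handlers[event].append(handler)

    def publish(self, event: str, payload: Any) -> None:
        # Each subscriber reacts without knowing about the others.
        for handler in self._handlers[event]:
            handler(payload)

bus = EventBus()
bus.subscribe("order.created", lambda o: print(f"billing: invoice for {o['id']}"))
bus.subscribe("order.created", lambda o: print(f"shipping: schedule {o['id']}"))
bus.publish("order.created", {"id": 101})
```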
Professional Attributes:
- Candidate should have good communication skills
- Candidate should have good documentation and presentation skills
- Candidate should be a team player and must be able to collaborate and network with different teams
- Highly motivated; able to work with minimal supervision and ambiguous requirements

- 5+ years of experience in a Data Engineering role in a cloud environment
- Must have good experience with Scala/PySpark, preferably in a Databricks environment (see the sketch after this list)
- Extensive experience with Transact-SQL.
- Experience with Databricks/Spark.
- Strong experience in data warehouse projects
- Expertise in database development projects with ETL processes.
- Manage and maintain data engineering pipelines
- Develop batch processing, streaming and integration solutions
- Experienced in building and operationalizing large-scale enterprise data solutions and applications
- Experience using one or more Azure data and analytics services in combination with custom solutions
- Azure Data Lake, Azure SQL DW (Synapse), and SQL Database products or equivalent products from other cloud services providers
- In-depth understanding of data management (e.g., permissions, security, and monitoring).
- Experience with cloud code repositories, e.g., GitHub and Git
- Experience in an agile environment (Azure DevOps preferred).
Good to have
- Manage source data access security
- Automate Azure Data Factory pipelines
- Continuous integration/continuous deployment (CI/CD) pipelines and source repositories
- Experience in implementing and maintaining CI/CD pipelines
- Understanding of Power BI and Delta Lakehouse architecture
- Knowledge of software development best practices.
- Excellent analytical and organization skills.
- Effective working in a team as well as working independently.
- Strong written and verbal communication skills.
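
For illustration, here is a minimal PySpark batch sketch of the kind of Databricks/Delta pipeline this role describes; the abfss:// lake paths, column names, and transforms are assumptions made for the example, not this employer's actual pipeline.

```python
# Minimal PySpark batch sketch for a Databricks-style pipeline:
# read raw CSV from a lake path, apply a transform, write a Delta table.
# The abfss:// paths, schema, and table layout are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-batch").getOrCreate()

# Extract: raw CSV landed in an Azure Data Lake container (hypothetical path).
raw = (spark.read
       .option("header", "true")
       .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/"))

# Transform: type casting, validation, and de-duplication.
cleaned = (raw
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount") > 0)
           .dropDuplicates(["order_id"]))

# Load: persist as a Delta table in the curated zone (hypothetical path).
(cleaned.write
 .format("delta")
 .mode("overwrite")
 .save("abfss://curated@examplelake.dfs.core.windows.net/orders/"))
```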
We at Datametica Solutions Private Limited are looking for SQL Engineers who have a passion for the cloud and knowledge of various on-premises and cloud data implementations in the field of Big Data and analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like.
Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.
Job Description
Experience : 4-10 years
Location : Pune
Mandatory Skills -
- Strong ETL/SQL development skills (see the sketch after this list)
- Strong data warehousing skills
- Hands-on experience working with Unix/Linux
- Development experience in enterprise data warehouse projects
- Good to have: experience with Python and shell scripting
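
To illustrate the kind of ETL/SQL development listed above, here is a minimal incremental-load sketch driven from Python; the staging and fact table names, the watermark column, and the SQLite backend are assumptions made for the example.

```python
# Minimal incremental-load sketch: copy only new rows from a staging table
# into a warehouse fact table, using a load-date watermark.
# Table and column names are illustrative assumptions, not a real schema.
import sqlite3

DDL = [
    """CREATE TABLE IF NOT EXISTS stg_sales (
           sale_id INTEGER, customer_id INTEGER, amount REAL, sale_date TEXT)""",
    """CREATE TABLE IF NOT EXISTS fact_sales (
           sale_id INTEGER PRIMARY KEY, customer_id INTEGER,
           amount REAL, sale_date TEXT)""",
]

# Only rows newer than the latest already-loaded sale_date are moved.
INCREMENTAL_LOAD = """
INSERT INTO fact_sales (sale_id, customer_id, amount, sale_date)
SELECT s.sale_id, s.customer_id, s.amount, s.sale_date
FROM stg_sales AS s
WHERE s.sale_date > (SELECT COALESCE(MAX(sale_date), '1900-01-01')
                     FROM fact_sales);
"""

def run_incremental_load(db_path: str) -> None:
    with sqlite3.connect(db_path) as conn:  # commits on success
        for stmt in DDL:
            conn.execute(stmt)
        conn.execute(INCREMENTAL_LOAD)

if __name__ == "__main__":
    run_incremental_load("warehouse.db")
```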
Opportunities -
- Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps Tools, Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume and Kafka
- Will get the chance to be part of enterprise-grade implementations of Cloud and Big Data systems
- Will play an active role in setting up modern data platforms based on Cloud and Big Data
- Will be part of teams with rich experience in various aspects of distributed systems and computing
About Us!
A global leader in data warehouse migration and modernization to the cloud, we empower businesses by migrating their data, workloads, ETL, and analytics to the cloud by leveraging automation.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, and Greenplum systems, along with ETL tools like Informatica, DataStage, Ab Initio, and others, to cloud-based data warehousing, with further capabilities in data engineering, advanced analytics solutions, data management, data lakes, and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.
We have our own products!
Eagle – Data warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican – Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.
Why join us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years are the key factors in achieving our success.
Benefits we Provide!
Working with highly technical, passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com
Hi,
Greetings from IP Momentum!!!
We are currently hiring a Sales Head / Business Development Head for our telecommunications entity, IP Momentum.
The primary purpose of the VoIP Sales Head is to head the team and drive sales by proactively prospecting and selling VoIP services to potential customers.
Present VoIP services through on-site meetings with decision-makers. Close sales opportunities.
Process orders accurately and efficiently, ensuring that all company standards and departmental expectations are met. Utilize, maintain, and continuously improve various sales and communication techniques.
Be an expert in VoIP products, services, prices, promotions, and technical skills. Thrive in a self-directed environment. Be an excellent prospector for new business.
Qualifications
Only professionals with experience in VoIP sales targeting call centers, corporates, and enterprises will be considered. Candidates must be able to communicate verbally in a clear, persuasive, and professional manner to achieve the desired sales and customer satisfaction goals, must be able to communicate and write professionally and concisely in English, and must possess knowledge, skills, and experience in telephone sales.
Skill: AWS, Java, Microservices
Experience: 3-7 Yrs
Location: Noida
Notice Period: Immediate/ 15 days


