
Skills / Tools
- SQL & Python / PySpark
- AWS services: Glue, AppFlow, Redshift (mandatory)
- Data warehousing basics
- Data modelling basics
Job Description
- Experience implementing and delivering data solutions and pipelines on the AWS cloud platform; design, implement, and maintain the data architecture for all AWS data services
- A strong understanding of data modelling, data structures, databases (Redshift), and ETL processes
- Work with stakeholders to identify business needs and requirements for data-related projects
- Strong SQL and/or Python or PySpark knowledge
- Creating data models that can be used to extract information from various sources & store it in a usable format
- Optimize data models for performance and efficiency
- Write SQL queries to support data analysis and reporting
- Monitor and troubleshoot data pipelines
- Collaborate with software engineers to design and implement data-driven features
- Perform root cause analysis on data issues
- Maintain documentation of the data architecture and ETL processes
- Identifying opportunities to improve performance by improving database structure or indexing methods
- Maintaining existing applications by updating existing code or adding new features to meet new requirements
- Designing and implementing security measures to protect data from unauthorized access or misuse
- Recommending infrastructure changes to improve capacity or performance
- Experience in the process industry
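The warehousing, modelling, and SQL reporting skills above can be sketched with a minimal star schema. This is an illustrative example only: the table and column names are invented, and SQLite stands in for Redshift, whose dialect is similar for simple joins and aggregations.

```python
import sqlite3

# One dimension table and one fact table -- the smallest possible star schema.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_product (
    product_id INTEGER PRIMARY KEY,
    category   TEXT NOT NULL
);
CREATE TABLE fact_sales (
    sale_id    INTEGER PRIMARY KEY,
    product_id INTEGER REFERENCES dim_product(product_id),
    amount     REAL NOT NULL
);
INSERT INTO dim_product VALUES (1, 'widgets'), (2, 'gadgets');
INSERT INTO fact_sales VALUES (1, 1, 10.0), (2, 1, 15.0), (3, 2, 7.5);
""")

# A typical reporting query: revenue by category, largest first.
cur.execute("""
SELECT p.category, SUM(f.amount) AS revenue
FROM fact_sales f
JOIN dim_product p ON p.product_id = f.product_id
GROUP BY p.category
ORDER BY revenue DESC
""")
rows = cur.fetchall()
print(rows)  # [('widgets', 25.0), ('gadgets', 7.5)]
```

The same join-and-aggregate shape is what most Redshift reporting queries reduce to once the dimensional model is in place.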

About ProtoGene Consulting Private Limited
Job Title: Salesforce Intelligence Cloud Specialist (Datorama)
Location: Mumbai, MH, India (Remote)
Experience: 3-6 years in Salesforce Intelligence Cloud or Datorama
Job Summary:
We are looking for a skilled Salesforce Intelligence Cloud Specialist with extensive experience in Datorama to create and manage analytics dashboards for our clients. The role involves integrating data from various sources, including Google Analytics, Snowflake, and Salesforce Marketing Cloud, to provide actionable insights via Datorama.
Key Responsibilities:
- Develop and maintain Datorama dashboards to deliver comprehensive analytics reports for clients.
- Integrate and manage data from Google Analytics, Snowflake, and Salesforce Marketing Cloud within Datorama.
- Ensure data accuracy across all data sources and dashboards, troubleshooting discrepancies where necessary.
- Collaborate with internal teams to understand business requirements and translate them into data-driven insights.
- Use Datorama connectors, APIs, and custom data streams to automate data ingestion and reporting.
- Monitor dashboard performance and optimize data pipelines for efficiency and accuracy.
- Provide training and support to internal teams and clients on Datorama usage and data interpretation.
Required Skills and Qualifications:
- Hands-on experience with Salesforce Intelligence Cloud (Datorama), including dashboard creation and advanced reporting.
- Strong knowledge of Google Analytics, Snowflake, and Salesforce Marketing Cloud data integration with Datorama.
- Proficiency in Datorama connectors, APIs, and working with multiple data streams.
- Experience in data modeling, ETL processes, and best practices for data visualization and reporting.
- Strong analytical skills, with the ability to troubleshoot and ensure data integrity across multiple sources.
- Excellent communication skills for explaining complex technical information to non-technical stakeholders.
Preferred Qualifications:
- Familiarity with SQL, especially for querying databases such as Snowflake.
- Experience working with Salesforce Marketing Cloud data and building custom solutions.
- Salesforce certifications related to Datorama or Salesforce Intelligence Cloud are a plus.
Education:
- Bachelor’s degree in computer science, Data Analytics, Information Systems, or a related field.
Mail updated resume with salary details to:
Email: jobs[at]glansolutions[dot]com
Satish- 8851O18162
Job Description
A Product Designer is a professional responsible for translating the wants and needs of consumers into product design features. They must have a creative eye, be able to think outside the box to come up with new ideas or solutions that meet customer expectations, and be able to create digital or print drawings as well as design fully functional products.
Roles and Responsibilities:
1. User Research:
- Conduct user research to understand user needs, preferences, and behaviors.
- Use research findings to inform design decisions and enhance the overall user experience.
2. Wireframing and Prototyping:
- Create wireframes, user flows, and prototypes to visualize and communicate design ideas.
- Iterate on designs based on feedback from stakeholders and usability testing.
3. Visual Design:
- Develop visually engaging and consistent UI designs for digital products.
- Ensure designs align with brand guidelines and maintain a cohesive visual identity.
4. Interaction Design:
- Design and implement interactive elements to enhance user engagement and satisfaction.
- Collaborate with development teams to ensure seamless integration of design and functionality.
5. Usability Testing:
- Plan and conduct usability testing sessions to gather feedback and validate design decisions.
- Analyze testing results and iterate on designs to improve overall usability.
6. Collaboration:
- Work closely with cross-functional teams, including developers, product managers, and other stakeholders.
- Communicate design decisions effectively and advocate for the user throughout the product development process.
7. Stay Updated:
- Keep abreast of industry trends, best practices, and emerging technologies in UI/UX design.
- Continuously improve skills and contribute to a culture of design excellence.
Experience and Skills you MUST have:
● Proven experience as a UI/UX Designer or similar role.
● Proficient in design tools such as Sketch, Figma, Adobe XD, or similar.
● Strong portfolio showcasing a range of design projects and problem-solving skills.
● Excellent communication and collaboration skills.
● Understanding of front-end development technologies and constraints.
● Familiarity with agile development processes
- Developing core infrastructure in Python and Django.
- Developing models and business logic (e.g., transactions, payments, diet plans, search, etc.).
- Architecting servers and services that enable new product features.
- Building out newly enabled product features.
- Monitoring system uptime and errors to drive us toward a high-performing and reliable product.
- Take ownership and understand the need for code quality, elegance, and robust infrastructure.
- Worked collaboratively on a software development team.
- Built scalable web applications.
Skills:
- Minimum 4 years of industry or open-source experience.
- Proficient in at least one OO language: Python(preferred)/Golang/Java.
- Writing high-performance, reliable and maintainable code.
- Good knowledge of database structures, theories, principles, and practices.
- Experience working with AWS components (EC2, S3, RDS, SQS, ECS, Lambda)
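The "models and business logic (transactions, payments)" responsibility above can be sketched as a small state machine. This is a hypothetical example, not the company's actual code: in the Django codebase, `Payment` would be a `models.Model` with the same fields and the transition logic on the model or in a service layer.

```python
from dataclasses import dataclass
from decimal import Decimal
from enum import Enum


class PaymentStatus(Enum):
    PENDING = "pending"
    CAPTURED = "captured"
    REFUNDED = "refunded"


@dataclass
class Payment:
    # Decimal, not float, for money -- a standard reliability choice.
    amount: Decimal
    status: PaymentStatus = PaymentStatus.PENDING

    def capture(self) -> None:
        # Only a pending payment may be captured.
        if self.status is not PaymentStatus.PENDING:
            raise ValueError(f"cannot capture a {self.status.value} payment")
        self.status = PaymentStatus.CAPTURED

    def refund(self) -> None:
        # Only a captured payment may be refunded.
        if self.status is not PaymentStatus.CAPTURED:
            raise ValueError(f"cannot refund a {self.status.value} payment")
        self.status = PaymentStatus.REFUNDED


p = Payment(amount=Decimal("499.00"))
p.capture()
p.refund()
print(p.status.value)  # refunded
```

Encoding valid transitions explicitly, rather than letting callers set `status` freely, is one way to keep business logic "reliable and maintainable" as the requirements list asks.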
- Big data developer with 8+ years of professional IT experience, with expertise in Hadoop ecosystem components for ingestion, data modeling, querying, processing, storage, analysis and data integration, and in implementing enterprise-level systems spanning Big Data.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components like Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands on experience in creating real - time data streaming solutions using Apache Spark core, Spark SQL & DataFrames, Kafka, Spark streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, which include NameNode, DataNode, ResourceManager, NodeManager and Job History Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration and management of Big Data and the underlying infrastructure of Hadoop clusters.
- Hands on experience in coding MapReduce/Yarn Programs using Java, Scala and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on clusters for computational analytics; installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster processing of data; handled importing data from different data sources into HDFS using Sqoop, performing transformations using Hive and MapReduce, and then loading data into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark's MLlib, which is used for predictive intelligence, customer segmentation and smooth maintenance in Spark Streaming.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Working on creating data pipelines for ingestion and aggregation events, and loading consumer response data into Hive external tables in an HDFS location to serve as a feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience in Microsoft cloud and in setting up clusters on Amazon EC2 & S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked on Spark using Python on cluster for computational (analytics), installed it on top of Hadoop performed advanced analytical application by making use of Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing Ad Hoc queries using Cloudera Impala, also used Impala analytical functions.
- Experience in creating Data frames using PySpark and performing operation on the Data frames using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Establishing multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and providing access for pulling the information needed for analysis.
- Generated various kinds of knowledge reports using Power BI based on Business specification.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Well experienced in projects using JIRA, testing, Maven and Jenkins build tools.
- Experienced in designing, building, deploying and utilizing almost all of the AWS stack (including EC2, S3), focusing on high availability, fault tolerance, and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience in importing data using Sqoop and SFTP from various sources like RDBMS, Teradata, Mainframes, Oracle and Netezza to HDFS, and performing transformations on it using Hive, Pig and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra and MongoDB, and their integration with Hadoop clusters.
- Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
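The MapReduce programming paradigm referenced repeatedly above (MapReduce/YARN programs, Hive queries compiled to MapReduce) can be sketched in plain Python. This is an illustrative word-count sketch only; real jobs run the same three phases on a Hadoop or Spark cluster, with the shuffle handled by the framework.

```python
from collections import defaultdict
from itertools import chain


def map_phase(line):
    """Emit (word, 1) pairs for each word -- the 'map' step."""
    return [(word.lower(), 1) for word in line.split()]


def shuffle_phase(pairs):
    """Group values by key -- what the framework's shuffle/sort does."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped


def reduce_phase(grouped):
    """Sum each key's values -- the 'reduce' step."""
    return {key: sum(values) for key, values in grouped.items()}


lines = ["big data big pipelines", "data pipelines"]
pairs = chain.from_iterable(map_phase(line) for line in lines)
counts = reduce_phase(shuffle_phase(pairs))
print(counts)  # {'big': 2, 'data': 2, 'pipelines': 2}
```

Because each phase only sees independent key/value pairs, the same logic parallelizes across a cluster, which is the property the Hadoop and Spark experience above is built on.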
Responsibilities
- Operational tasks
- Should be able to work in the US East time zone independently
- Ensure the day-to-day functioning of team members operating hybrid is seamless.
- Ensure all team members are provided with all the necessary software, applications, assets, etc.
- Make co-working arrangements when the need arises
- Find the best options in the procurement of assets, tools, software, and services with regard to quality and cost.
- Take necessary steps to ensure all the operational tasks are undertaken with an eye on quality and value for money.
- IT Support
- Maintain IT software, tools, and applications repository.
- User management with regard to all the required software, tools, and applications, both generic and project-specific.
- Purchase, renewal, and cancellation of the software, tools, and applications.
- Ensure all the software, tools and applications have the least possible downtime.
- Logistics
- Take ownership of the logistical arrangements for company initiatives, programs or events.
- Vendor management
- Team Member Engagement
- Initiate, support, or coordinate events that play a role in team member engagement.
- Collaborate with other support teams, technology teams, and other stakeholders in their initiatives.
- Own the health insurance platform, its user management, team member support, renewal, and cost optimization by ensuring we are getting the best from the vendor.
- Other
- Own timely communications to all stakeholders about new initiatives, changes, downtimes, maintenance, etc.
- Achieve and maintain quality and efficiency in all tasks assigned.
- Align with the company's core values, respect, and adhere to all company guidelines.
- Operate with utmost integrity, behave ethically, and maintain respect for all.
- Event management
Qualifications
- Bachelor's degree in Human Resources or a related field; MBA (IT Systems/Operations), or BCA with an MBA.
- Good communication skills
- Any prior experience is good to have
- People-oriented and results-driven
- Working knowledge of Google Workspace, Gmail, Docs, Sheets, Slides, and Drive.
- Working knowledge of system and data security tools.
- Should be a tech-savvy individual with the ability to learn new tools, software, and applications, including their use, management, and basic troubleshooting.
- The ability to work as part of a team.
- Resourceful, self-motivated, and proactive.
- Strong analytical and problem-solving skills.
- Excellent administrative and organizational skills.
The responsibilities and duties section is the most important part of the job description. Please read them below and if interested, do go ahead and apply!
Roles and Responsibilities:
- Manages talent acquisition process, including sourcing, testing, interviewing, hiring and onboarding
- Keeps job descriptions up-to-date, accurate and compliant with relevant federal, state and local laws for all positions
- Develops training and performance management program that ensures all employees are familiar with their job responsibilities, as well as relevant legal and safety requirements
- Creates and updates compensation strategy through market analysis and pay surveys
- Handles investigation and resolution of employee issues, concerns and conflicts
- Ensures all employment practices comply with federal, state and local regulations
Gynecologists give reproductive and sexual health services that include pelvic exams, Pap tests, cancer screenings, and testing and treatment for vaginal infections. They diagnose and treat reproductive system disorders such as endometriosis, infertility, ovarian cysts, and pelvic pain.
Diploma in Gynaecology & Obstetrics (D.G.O.) – 2 years. Doctor of Medicine (MD) in Gynaecology & Obstetrics – 3 years. Master of Surgery (MS) in Gynaecology & Obstetrics – 3 years.
Degree: Master of Surgery
Salary: 2 lakh/month
- Strong knowledge of PHP for backend work (3+ years’ experience)
- Experience with an MVC framework (CakePHP, Laravel, Symfony, CodeIgniter, etc.)
- Strong knowledge of JavaScript, HTML, CSS and related tools/packages (3+ years’ experience)
- Strong knowledge of MySQL (3+ years’ experience)
- Knowledge of React
- Not required, but experience with Python, Golang, DynamoDB, or Elasticsearch is a plus
- Strong English skills (written and verbal)
About Wise
Wise is digital infrastructure for online education. Oversimplified as ‘Shopify for Tutors’, i.e., an easy-to-use tool for tutors to start teaching online. We have grown to 2mn users in 12 months and have much to do. We are backed by an incredible set of investors and individuals who have built products you already use. We have a small but rockstar team, cheap equity, smart money, an opportunity to capture a huge market ($60bn) and a chance to make an impact while doing so.
What we need
We need you, if you are a senior software engineer with 4+ years of experience in building robust scalable systems. Work would involve building things quickly and adapting based on market feedback while ensuring the security and quality of the codebase. If you love coding and building things excites you, we would like to talk to you.
What you will need (Skills)
- Experience building microservices and distributed systems, RESTful APIs, user interfaces
- Knowledge and will of writing unit, integration tests is a must. TDD is a plus
- Preferably a polyglot: strong command of Ruby/Golang/Java/Python
- Some exposure to infra: Deployments, CI/CD setup, security. AWS exposure is good to have
- Strong database concepts. Experience with MongoDB is a plus
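The unit-testing requirement above can be sketched with the standard library's `unittest`. This is a hypothetical example: `slugify()` is a made-up helper, not part of Wise's codebase, and stands in for any small pure function worth testing first (the TDD habit the posting asks for).

```python
import unittest


def slugify(title: str) -> str:
    """Turn a course title into a URL-safe slug (illustrative helper)."""
    return "-".join(title.lower().split())


class SlugifyTests(unittest.TestCase):
    def test_lowercases_and_joins_words(self):
        self.assertEqual(slugify("Intro To Algebra"), "intro-to-algebra")

    def test_collapses_extra_whitespace(self):
        # str.split() with no argument drops runs of whitespace.
        self.assertEqual(slugify("  Physics   101 "), "physics-101")


# Run the suite programmatically rather than via unittest.main().
suite = unittest.TestLoader().loadTestsFromTestCase(SlugifyTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

Integration tests would exercise the same functions through the service's real API and database instead of in isolation.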
What you will do
- Feature analysis, hands on development, code reviews, deployment & rollouts
- Passionately maintain coding practices, quality & good design standards
- Design and develop highly scalable, available, secure and fault tolerant systems
- Actively contribute to assessing and improving/optimizing security & infrastructure
Great to have
- Sense of ownership
- Developers who can talk product and strategy
- Experience in building scalable consumer products
- Strong opinions, loosely held
Good to have
- Interest in mentorship
- Good written communication skills
- Bangalore love
Probably not ideal if
- You have always been right about stuff in life
- You have no flexibility around working hours (this doesn’t mean we work long hours, but if our systems get attacked in the middle of the night, we would need you to HODORR!)
Definitely get in touch if you
- Would have started the exact same company but willing to build it together with a great team
- Find our current product unbearably slow or inadequate
- Think this company isn’t going to work as it is right now
Required Experience : 2 to 6 Years
Technical Skillsets Required :
- 2+ years of professional software development experience
- 2+ years of object-oriented Java / J2EE development
- Full SDLC experience (requirements gathering, architecture, development, QA, etc.)
- Experience with Spring (MVC, IoC/DI, REST, Security) & Hibernate/Spring
- Experience with SOAP / REST web services
- Knowledge of SQL
- Knowledge of NoSQL concepts; understanding of Solr, Redis and MongoDB is desirable
- Understanding of CDN & content management concepts
- Must have worked on at least one messaging solution
- Bonus Points for micro service design and development experience
- Bonus points for any mobile development experience
- Used Agile methodology
- Experience leading or working with cross geography teams
- Bonus Points for experience working on Unix, shell scripting & Build Systems
- Experience in performance optimization is an added advantage
- Bonus points for hands-on experience, at least at a basic level, with Solr/Elasticsearch
- Knowledge of microservice architecture is a must
POSITION :
- You will be actively involved during the entire technology development lifecycle. Responsibilities will include all aspects from design, coding, testing, customer feedback cycle changes and support.
- Primary role in software development with object-oriented Java
- The customized solutions that you will be architecting and developing will also require knowledge and experience with Spring/Hibernate, SOAP/REST, and SQL.
- Ability to learn new technologies quickly and willingness to read and digest large existing source code and take ownership on complex component or subsystem to drive improvements and re-architecture
- Work with a team of amazing developers and designers involved in the design and development of global platforms
- Invent and prototype new features, build, test and ship them to customers as SaaS, cloud based or hosted product platforms
- Drive the implementation of new technologies which improve our ability to build great customer products.
- Participate in a fun, open learning environment with great benefits and smart talented folks that represent among the best globally.
QUALIFICATION :
- A Bachelor's degree in Computer Science (or equivalent experience)
- M.Tech or advanced degree a plus
- Hackathon participation and accolades are a plus
Job Role :
- Quinbay is an upstart digital platforms and products company with a core focus on disrupting traditional markets and business models with the strength of our innovation driven digital value. We harness the power of our open innovation culture, our unique talent selection approach, the skills and expertise of our people across various industries, domains and technologies and a unique blend of analytics driven strategies for creating future digital platforms.
- OPPORTUNITIES : Pursue opportunities in and learning/growth interests in Mobile, product development, product management, Analytics, Machine learning, UI/UX design, DevOps, Automation, Drones, bots and Java / open source development frameworks and methodologies - on a variety of domains like eCommerce, Logistics and fulfilment, Mcommerce and a whole lot others.
- Opportunities to be part of fastest growing eCommerce platforms in exciting application areas like core commerce, supply chain, logistics and fulfilment, merchant, mobile commerce, analytics, automation and a lot more for the global markets positioned for Asia.