
🚀 Hiring: Postgres DBA at Deqode
⭐ Experience: 6+ Years
📍 Location: Pune & Hyderabad
⭐ Work Mode: Hybrid
⏱️ Notice Period: Immediate joiners only (or candidates currently serving notice period)
Looking for an experienced Postgres DBA with:
✅ 6+ years in Postgres & strong SQL skills
✅ Good understanding of database services & storage management
✅ Performance tuning & monitoring expertise
✅ Knowledge of Dataguard admin, backups, upgrades
✅ Basic Linux admin & shell scripting
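To give a flavour of the performance tuning and monitoring side of the role, here is a minimal sketch of the kind of query a Postgres DBA runs against the `pg_stat_statements` extension. The column names assume Postgres 13+ (older versions use `total_time`/`mean_time`), and the Python wrapper and `top_queries` helper are purely illustrative, not part of the posting:

```python
# Monitoring query to find the most expensive statements server-wide.
# Assumes the pg_stat_statements extension is installed and Postgres 13+.
TOP_QUERIES_SQL = """
SELECT query,
       calls,
       round(total_exec_time::numeric, 2) AS total_ms,
       round(mean_exec_time::numeric, 2)  AS mean_ms
FROM pg_stat_statements
ORDER BY total_exec_time DESC
LIMIT 10;
"""

def top_queries(cursor):
    """Run the monitoring query on an open DB-API cursor (e.g. psycopg2)."""
    cursor.execute(TOP_QUERIES_SQL)
    return cursor.fetchall()
```

In practice a DBA would pair this with `EXPLAIN (ANALYZE, BUFFERS)` on the worst offenders to decide on indexing or query rewrites.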
We are seeking an experienced Data Scientist to join our data-driven team. As a Data Scientist, you will work with large datasets, apply advanced analytics techniques, and build machine learning models to provide actionable insights that drive business decisions. You will collaborate with various teams to translate complex data into clear recommendations and innovative solutions.
Key Responsibilities:
- Analyze large datasets to identify trends, patterns, and insights that can inform business strategy.
- Develop, implement, and maintain machine learning models and algorithms to solve complex problems.
- Work closely with stakeholders to understand business objectives and translate them into data science tasks.
- Preprocess, clean, and organize raw data from various sources for analysis.
- Conduct statistical analysis and build predictive models to support data-driven decision-making.
- Create data visualizations and reports to communicate findings clearly and effectively to both technical and non-technical teams.
- Design experiments and A/B testing to evaluate business initiatives.
- Ensure the scalability and performance of data pipelines and machine learning models.
- Collaborate with engineering teams to integrate data science solutions into production systems.
- Continuously stay updated with the latest developments in data science, machine learning, and analytics technologies.
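The A/B testing responsibility above can be sketched with a standard two-proportion z-test; the conversion counts below are made up for illustration, and real analyses would typically reach for scipy or statsmodels rather than hand-rolled statistics:

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test for an A/B conversion experiment.

    Returns (z, p_value). Uses a normal approximation, so it assumes
    reasonably large sample sizes in both arms.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF via erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: 120/2400 conversions in control, 156/2400 in variant.
z, p = two_proportion_ztest(120, 2400, 156, 2400)
```

Here z comes out around 2.23 with p below 0.05, so the variant's lift would be called significant at the conventional threshold.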
The Cloudera Data Warehouse Hive team is looking for a passionate senior developer to join our growing engineering team. This group targets the biggest enterprises wanting to utilize Cloudera's services in private and public cloud environments. Our product is built on open source technologies like Hive, Impala, Hadoop, Kudu, Spark and many more, providing unlimited learning opportunities.
A Day in the Life
Over the past 10+ years, Cloudera has experienced tremendous growth, making us the leading contributor to Big Data platforms and ecosystems and a leading provider of enterprise solutions based on Apache Hadoop. You will work with some of the best engineers in the industry, who are tackling challenges that will continue to shape the Big Data revolution. We foster an engaging, supportive, and productive work environment where you can do your best work. The team culture values engineering excellence, technical depth, grassroots innovation, teamwork, and collaboration.
You will manage product development for our CDP components and develop engineering tools and scalable services to enable efficient development, testing, and release operations. You will be immersed in many exciting, cutting-edge technologies and projects, including collaboration with developers, testers, product, field engineers, and our external partners, both software and hardware vendors.
Opportunity:
Cloudera is a leader in the fast-growing big data platforms market. This is a rare chance to make a name for yourself in the industry and in the Open Source world. The candidate will be responsible for Apache Hive and CDW projects. We are looking for a candidate who would like to work on these projects upstream and downstream. If you are curious about the project and code quality, you can check the project and the code at the following link: Apache Hive. You can even start development before you join; this is one of the beauties of the OSS world.
Responsibilities:
• Build robust and scalable data infrastructure software
• Design and create services and system architecture for your projects
• Improve code quality through writing unit tests, automation, and code reviews
• Write Java code and/or build services in the Cloudera Data Warehouse
• Work with a team of engineers who review each other's code/designs and hold each other to an extremely high bar for quality
• Understand the basics of Kubernetes
• Build out the production and test infrastructure
• Develop automation frameworks to reproduce issues and prevent regressions
• Work closely with other developers providing services to our system
• Help analyze and understand how customers use the product, and improve it where necessary
Qualifications:
• Deep familiarity with the Java programming language
• Hands-on experience with distributed systems
• Knowledge of database concepts and RDBMS internals
• Knowledge of the Hadoop stack, containers, or Kubernetes is a strong plus
• Experience working in a distributed team
• 3+ years of experience in software development
Hello
About the Company
Established in the year 1953, our client is one of the leading TMT bar manufacturers and exporters in India. It is a well-known group with a turnover of 3000 Cr per annum. The group is setting up a construction & architecture oriented omnichannel B2B & B2C platform.
Job Description
- Gather intelligence from key business leaders about needs and future growth
- Partner with the internal IT team to ensure each project meets a specific need and resolves successfully
- Assume responsibility for project tasks and ensure they are completed in a timely fashion
- Evaluate, test and recommend new opportunities for enhancing our software, hardware and IT processes
- Compile and distribute reports on application development and deployment
- Design and execute A/B testing procedures to extract data from test runs
- Evaluate and draw conclusions from data related to customer behavior
- Consult with the executive team and the IT department on the newest technology and its implications in the industry
Requirements:
- Bachelor’s Degree in Software Development, Computer Engineering, Project Management or related field
- 5+ years’ experience in technology development and deployment
Regards
Team Merito
.NET Lead (Need B3) Job Description:
Responsibilities / Expectations
- Tech/team lead requirement in the ICS Simplification domain in the MAAS application.
- 5-8 years of total IT experience.
- At least 4 years in Application Development/Maintenance/Support using .NET Framework
- Should be able to perform migration of legacy applications to Cloud/On-Prem by thoroughly understanding the integration and compatibility requirements
- Should be able to debug and resolve application issues related to migration and compatibility with the latest Windows/RHEL environments
Skills required
Technical Skills (Must have)
- Strong understanding of .NET Architecture and Compatibility requirements
- Understanding of Data Architecture, and Implementing Databases
- Understanding of Data Migrations and Data Integrations
- Application & Application security knowledge (certificates/authentication/authorization)
Technical Skills (Good to have)
- Knowledge of Cloud resources: Storage, Networking, Security, Identity, Management
- Experience in Migration of legacy applications to Cloud/On Prem
Soft Skills
- Should interact / communicate effectively with Different domains for application Installation and issue resolution
- Need to interact with other teams related to any integrations with application migration.
- Effective Stakeholder/Customer Management.
- Engaging with necessary stakeholders and SMEs.
- Good Problem Solving skills and approach
- Team handling and Mentoring
- Handling and Minimising Escalations.
- Provide weekly/fortnightly/monthly status updates to the customer and maintain regular customer connects
Responsibilities
- Understanding the business requirements so as to formulate the problems to solve and restrict the slice of data to be explored.
- Collecting data from various sources.
- Performing cleansing, processing, and validation on the data to be analyzed, in order to ensure its quality.
- Exploring and visualizing data.
- Performing statistical analysis and experiments to derive business insights.
- Clearly communicating the findings from the analysis to turn information into something actionable through reports, dashboards, and/or presentations.
- Preparing business dashboards for teams to add transparency in the process and uncover bottlenecks
- Conceiving and preparing product dashboards that transparently highlight the user journey on the BitClass platform and outline bottlenecks/wins in the same.
Skills
- Experience solving problems in the project’s business domain.
- Experience with data integration from multiple sources
- Proficiency in at least one query language, especially SQL.
- Working experience with NoSQL databases, such as MongoDB and Elasticsearch.
- Working experience with popular statistical and machine learning techniques, such as clustering, linear regression, KNN, decision trees, etc.
- Good scripting skills using Python, R or any other relevant language
- Proficiency in at least one data visualization tool, such as Matplotlib, Plotly, D3.js, ggplot, etc.
- Great communication skills.
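As a minimal illustration of the linear regression technique named in the skills list above, an ordinary least squares fit in pure Python; the data points are a toy example, and real work would use numpy or scikit-learn:

```python
from statistics import mean

def linear_fit(xs, ys):
    """Ordinary least squares fit of y = a + b*x for a single feature.

    b = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2), a = y_bar - b*x_bar.
    """
    x_bar, y_bar = mean(xs), mean(ys)
    b = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
         / sum((x - x_bar) ** 2 for x in xs))
    a = y_bar - b * x_bar
    return a, b

# Toy data: hours of content watched vs. quiz score (made up).
a, b = linear_fit([1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8])
```

The fitted slope and intercept can then be read directly as "each extra hour adds about b points", which is the kind of actionable framing the responsibilities above ask for.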
- Total Experience of 7-10 years and should be interested in teaching and research
- 3+ years’ experience in data engineering which includes data ingestion, preparation, provisioning, automated testing, and quality checks.
- 3+ years of hands-on experience with Big Data cloud platforms like AWS and GCP, Data Lakes, and Data Warehouses
- 3+ years with Big Data and Analytics technologies. Experience in SQL and in writing code on the Spark engine using Python, Scala, or Java. Experience in Spark and Scala.
- Experience in designing, building, and maintaining ETL systems
- Experience in data pipeline and workflow management tools like Airflow
- Application development background, along with knowledge of analytics libraries, open-source Natural Language Processing, and statistical and big data computing libraries
- Familiarity with Visualization and Reporting Tools like Tableau, Kibana.
- Should be good at technical storytelling
Qualification: B.Tech / BE / M.Sc / MBA / B.Sc, Having Certifications in Big Data Technologies and Cloud platforms like AWS, Azure and GCP will be preferred
Primary Skills: Big Data + Python + Spark + Hive + Cloud Computing
Secondary Skills: NoSQL+ SQL + ETL + Scala + Tableau
Selection Process: 1 Hackathon, 1 Technical round and 1 HR round
Benefit: Free-of-cost training on Data Science from top-notch professors
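The data engineering skills listed above (ingestion, preparation, quality checks) follow an extract-transform-load shape that can be sketched minimally in plain Python. In the role itself this logic would live in a Spark job orchestrated by a tool like Airflow; the data and field names here are made up:

```python
import csv
import io

# Raw input as it might arrive from an upstream source (note the missing amount).
RAW = """user_id,event,amount
u1,purchase,10.5
u2,purchase,
u1,refund,3.0
"""

def extract(text):
    """Extract: parse the raw CSV into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with a missing amount and cast types (a quality check)."""
    return [
        {"user_id": r["user_id"], "event": r["event"], "amount": float(r["amount"])}
        for r in rows
        if r["amount"]
    ]

def load(rows):
    """Load: aggregate net spend per user, a stand-in for a warehouse table."""
    totals = {}
    for r in rows:
        sign = -1 if r["event"] == "refund" else 1
        totals[r["user_id"]] = totals.get(r["user_id"], 0.0) + sign * r["amount"]
    return totals

totals = load(transform(extract(RAW)))
```

The same extract/transform/load stages map one-to-one onto tasks in an Airflow DAG or stages in a Spark job.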
Roles & Responsibilities -
This position is a hands-on Python/SQL software developer role. The candidate needs exposure to the electronic trading business, proficiency in Python, and experience building systems for data processing. The candidate will join the front-office development team.
- Build and maintain infrastructure for data retrieval, processing and storage
- Build strong working relationships with international teams
- Be willing and able to adapt to changes in priorities
- Ability to learn and apply new technologies to deliver added business value
- Maintain a strong focus on quality
Skill Sets & Prerequisites -
- Proficient in Python / SQL
- Good understanding of database management systems
- Experience developing and maintaining systems that handle large amounts of data
- Understanding of Electronic Trading Systems
- Attention to detail and code quality
- Excellent problem solving and analytical skills in a high-pressure environment
- Strong communication skills and an ability to convey ideas and concepts with clarity
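As a small sketch of the kind of data processing this role involves, computing a volume-weighted average price (VWAP) from a stream of trades; the trade data is invented for illustration:

```python
def vwap(trades):
    """Volume-weighted average price over an iterable of (price, quantity) pairs.

    VWAP = sum(price * quantity) / sum(quantity).
    """
    notional = sum(p * q for p, q in trades)
    volume = sum(q for _, q in trades)
    if volume == 0:
        raise ValueError("no volume traded")
    return notional / volume

# Hypothetical fills: (price, quantity).
trades = [(100.0, 10), (101.0, 20), (99.5, 10)]
avg = vwap(trades)
```

In a production trading system the same calculation would run incrementally over a live feed rather than over an in-memory list, with careful attention to the edge cases (zero volume, out-of-order ticks) the skills list implies.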
Experience: 12 - 20 years
Responsibilities :
The Cloud Solution Architect/Engineer specializing in migrations is a cloud role in the project delivery cycle with hands-on experience migrating customers to the cloud.
Demonstrated experience in cloud infrastructure project deals for hands-on migration to public clouds such as Azure.
Strong background in Linux/Unix and/or Windows administration.
Ability to use a wide variety of open source technologies.
Closely work with Architects and customer technical teams in migrating applications to Azure cloud in Architect Role.
Mentor and monitor the junior developers and track their work.
Design as per best practices and industry-standard coding practices.
Ensure services are built for performance, scalability, fault tolerance and security with reusable patterns.
Recommend best practices and standards for Azure migrations.
Define coding best practices for high performance and guide the team in adopting the same
Skills:
Mandatory:
Experience with cloud migration technologies such as Azure Migrate
Azure trained / certified architect – Associate or Professional Level
Understanding of hybrid cloud solutions and experience integrating public cloud into traditional hosting/delivery models.
Strong understanding of cloud migration techniques and workflows (on premise to Cloud Platforms)
Configuration, migration and deployment experience in Azure apps technologies.
High Availability and Disaster recovery implementations
Experience architecting and deploying multi-tiered applications.
Experience building and deploying multi-tier, scalable, and highly available applications using Java, Microsoft and Database technologies
Experience in performance tuning, including load balancing, web servers, content delivery networks, and caching (content and API).
Experience in large scale data center migration
Experience of implementing architectural governance and proactively managing issues and risks throughout the delivery lifecycle.
Good familiarity with the disciplines of enterprise software development such as configuration & release management, source code & version controls, and operational considerations such as monitoring and instrumentation
Experience of consulting or service provider roles (internal, or external);
Experience using database technologies like Oracle, MySQL and understanding of NoSQL is preferred.
Experience in designing or implementing data warehouse solutions is highly preferred.
Experience in automation/configuration management using Puppet, Chef, Ansible, Saltstack, Bosh, Terraform or an equivalent.
Experience with source code management tools such as GitHub, GitLab, Bitbucket or equivalent
Experience with SQL and NoSQL databases such as MySQL.
Solid understanding of networking and core Internet Protocols such as TCP/IP, DNS, SMTP, HTTP and routing in distributed networks.
A working understanding of code and script such as: PHP, Python, Perl and/or Ruby.
A working understanding with CI/CD tools such as Jenkins or equivalent
A working understanding of scheduling and orchestration with tools such as Kubernetes, Mesos, Swarm, or equivalent.
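High availability targets like those mentioned above are usually discussed in "nines". As a back-of-the-envelope sketch (the function name and figures are illustrative, not from the posting), converting an availability percentage into allowed downtime per year:

```python
def downtime_minutes_per_year(availability_pct):
    """Allowed downtime per (non-leap) year for a given availability target.

    e.g. 99.9% ("three nines") allows roughly 525.6 minutes of downtime a year,
    while 99.99% ("four nines") allows roughly 52.6 minutes.
    """
    minutes_per_year = 365 * 24 * 60  # 525600
    return (1 - availability_pct / 100) * minutes_per_year

three_nines = downtime_minutes_per_year(99.9)
four_nines = downtime_minutes_per_year(99.99)
```

Numbers like these drive the HA/DR design choices in a migration: each extra nine cuts the annual downtime budget by a factor of ten, and the architecture (multi-zone, multi-region, failover automation) has to be sized accordingly.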







