11+ MDI Jobs in Bangalore (Bengaluru) | MDI Job openings in Bangalore (Bengaluru)
Apply to 11+ MDI Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest MDI Job opportunities across top companies like Google, Amazon & Adobe.
Note: No Work from home
Required skills and experience
Should have 2+ years of development experience in C++.
Preference for candidates with VC++ and MFC experience.
Strong object-oriented design skills and C/C++ and VC++ programming skills
Experience with MFC-based GUI design preferred
Experience developing applications using dialog-based, MDI, and SDI architectures
Experience developing Win32 or MFC-based DLLs and libraries
Hands-on experience in implementing multi-threaded applications
Responsibilities:
- Lead the design, development, and implementation of scalable data architectures leveraging Snowflake, Python, PySpark, and Databricks.
- Collaborate with business stakeholders to understand requirements and translate them into technical specifications and data models.
- Architect and optimize data pipelines for performance, reliability, and efficiency.
- Ensure data quality, integrity, and security across all data processes and systems.
- Provide technical leadership and mentorship to junior team members.
- Stay abreast of industry trends and best practices in data architecture and analytics.
- Drive innovation and continuous improvement in data management practices.
Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field. Master's degree preferred.
- 5+ years of experience in data architecture, data engineering, or a related field.
- Strong proficiency in Snowflake, including data modeling, performance tuning, and administration.
- Expertise in Python and PySpark for data processing, manipulation, and analysis.
- Hands-on experience with Databricks for building and managing data pipelines.
- Proven leadership experience, with the ability to lead cross-functional teams and drive projects to successful completion.
- Experience in the banking or insurance domain is highly desirable.
- Excellent communication skills, with the ability to effectively collaborate with stakeholders at all levels of the organization.
- Strong problem-solving and analytical skills, with a keen attention to detail.
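The data-quality responsibility above can be sketched as a minimal, framework-agnostic gate; in practice the same checks would run inside a PySpark or Databricks job. The field names (`customer_id`, `amount`, `event_date`) are hypothetical:

```python
# Minimal data-quality gate: reject records that fail basic integrity
# checks before they enter a pipeline. Field names are hypothetical.
REQUIRED_FIELDS = ("customer_id", "amount", "event_date")

def validate_record(record: dict) -> bool:
    # Every required field must be present and non-null
    if any(record.get(f) is None for f in REQUIRED_FIELDS):
        return False
    # Illustrative domain rule: amounts must be non-negative
    return record["amount"] >= 0

def split_valid_invalid(records):
    # Partition a batch into loadable rows and rows routed to quarantine
    valid = [r for r in records if validate_record(r)]
    invalid = [r for r in records if not validate_record(r)]
    return valid, invalid
```

The same split maps naturally onto two DataFrame filters in Spark, with the invalid partition written to a quarantine table for review.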
Benefits:
- Competitive salary and performance-based incentives.
- Comprehensive benefits package, including health insurance, retirement plans, and wellness programs.
- Flexible work arrangements, including remote options.
- Opportunities for professional development and career advancement.
- Dynamic and collaborative work environment with a focus on innovation and continuous learning.
Greetings!!!
We are hiring for the position of "Starlims Developer" for one of the IT MNCs.
Exp: 5.5 - 12 yrs
Loc: PAN India
Skills: Starlims, Starlims developer, SQL
Job Description:
- Only Starlims developers will be considered.
- Candidates should have experience with SSL (Starlims Scripting Language).
- Candidates should have experience with database design and optimization on MS SQL, and with web services.
Shvasa is a yoga startup focussed on taking authentic yoga practices from India to the western world.
We are a team of internet entrepreneurs who love yoga as much as we love science.
Our vision is to lighten the load of humanity one breath at a time.
The real pandemic in the world is stress, anxiety, and procrastination. Yoga has all the tools which if practiced consistently can help you reach your highest potential. Here we bring the best teachers on our platform and give them all the tools necessary to make a practitioner's journey easy and super effective.
We are a small team of calm and passionate folks, and we are looking for like-minded people who share our love for yoga and are the best in their respective specializations to join our team.
- Handle the dual role of being the face of the brand to the customers, and the voice of the customer to the company.
- Handle customer queries related to Shvasa's products, brand, sales, payments and other related topics.
- Own the Live chat feature and inbound customer calls
- Identify and assess customers' needs to achieve satisfaction
- Build sustainable relationships and trust with customer accounts through open and interactive communication
- Create and present reports as required.
- Provide internal teams with constant feedback to improve products and processes as per valid customer needs.
Key skills required
- Proven customer support experience or experience as a client service representative
- Exposure to US customers, with excellent oral and written communication skills in English.
- Ability to independently handle queries, proactively update one's own knowledge by keeping in touch with other stakeholders.
- A customer focussed mindset with the sensitivity to handle complex queries and escalations as well.
- Ability to work in a fast-paced, high-pressure environment
- Ability to problem-solve, multi-task, prioritize, and manage time effectively
Note: This is a night shift role.
- Create and lead innovative programs, software, and analytics that drive improvements to the availability, scalability, efficiency, and latency of the application
- Work with Directors to define and execute technical strategy and arbitrate technical processes and decisions.
- Work with development teams to guide future technology choices and foster cross-team collaboration
- Work across all teams to help define and clarify requirements, explore technical feasibility, and help define product directions and plans.
- Work closely with product & engineering leaders and tech leads to turn requirements into actionable plans that teams can understand.
- Define, refine, and develop POCs with quick turnaround and demonstrate them to all stakeholders
- Embed with teams from time to time to write code, provide technical guidance, etc.
- Very deep knowledge of the entire tech stack. Excellent understanding of the entire SDLC.
- Acts as a force multiplier in getting things done.
- Acts as SME for the team or product area within the Engineering organization and for cross-functional organizations as well.
- Teaches others why new features are important. Good understanding of customer use cases.
- A Bachelor's degree in any technical discipline
- Minimum 5 years of experience administering AEM applications
- Build management using Bamboo/Jenkins or relevant tech
- Configuration and Release management
- Design and build enablement
- Good communication skills
Experience : 0-1 Year ( Freshers are welcome)
Job Location: Banashankari, Bangalore
Roles & Responsibilities:
- Read and understand the scope of work, and deliverables for the assigned project.
- Get confirmation on the same from TL/Manager.
- Ensure completeness/adequacy of data for a specified project. Coordinate with the site team.
- Have a good understanding of the relevant standards, and testing procedures.
- Prepare the reports as per the standard formats. Ensure customization is done if required from the client end.
- Keep track of completed, ongoing and upcoming jobs.
- Participate in the report discussions.
- Actively participate in training
- Experience with the 3DCS stack-up tool (preferred) or any other 3D tolerance analysis tool
- Experience performing tolerance stack-up calculations (1D & 2D) such as worst-case and RSS
- Experience using GD&T symbols in drawings (ASME or ISO)
- Knowledge of parts and assembly inspection techniques (CMM, inline measurement, etc.)
- Knowledge of assembly techniques (basic understanding of jig and fixture concepts)
- Knowledge of press, injection moulding and casting tools
- Exposure to component / overall vehicle development processes
- Fair communication and interpersonal skills.
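The worst-case and RSS stack-up calculations named above are simple enough to sketch. Worst-case sums the component tolerances directly; RSS (root sum of squares) combines them statistically, assuming independent, normally distributed variations. The tolerance values below are hypothetical:

```python
import math

def worst_case_stackup(tolerances):
    # Worst-case: assume every component sits at its tolerance limit
    return sum(abs(t) for t in tolerances)

def rss_stackup(tolerances):
    # RSS: statistical combination for independent, normally
    # distributed component variations
    return math.sqrt(sum(t * t for t in tolerances))

# Hypothetical 1D stack of four parts, each with a +/-0.1 mm tolerance
tols = [0.1, 0.1, 0.1, 0.1]
print(worst_case_stackup(tols))  # 0.4 mm
print(rss_stackup(tols))         # 0.2 mm
```

RSS yields a tighter (less conservative) result than worst-case, which is why the choice of method matters when budgeting assembly tolerances.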
BRIEF DESCRIPTION:
At least 1 year of Python, Spark, SQL, and data engineering experience
Primary Skillset: PySpark, Scala/Python/Spark, Azure Synapse, S3, RedShift/Snowflake
Relevant Experience: Legacy ETL job Migration to AWS Glue / Python & Spark combination
ROLE SCOPE:
Reverse engineer the existing/legacy ETL jobs
Create the workflow diagrams and review the logic diagrams with Tech Leads
Write equivalent logic in Python & Spark
Unit test the Glue jobs and certify the data loads before passing to system testing
Follow the best practices, enable appropriate audit & control mechanism
Strong analytical skills; identify root causes quickly and debug issues efficiently
Take ownership of the deliverables and support the deployments
REQUIREMENTS:
Create data pipelines for data integration into cloud stacks, e.g. Azure Synapse
Code data processing jobs in Azure Synapse Analytics, Python, and Spark
Experience in dealing with structured, semi-structured, and unstructured data in batch and real-time environments.
Should be able to process .json, .parquet and .avro files
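Processing the three file formats listed above usually starts with dispatching on the file extension. A minimal sketch: JSON is handled with the standard library, while Parquet and Avro would typically use `pyarrow` and `fastavro` (left unimplemented here to keep the sketch dependency-free):

```python
import json
from pathlib import Path

def read_records(path: str):
    # Dispatch on file extension to the appropriate reader
    suffix = Path(path).suffix.lower()
    if suffix == ".json":
        with open(path) as f:
            return json.load(f)
    if suffix in (".parquet", ".avro"):
        # In a real Glue/Spark job these formats are read natively
        # (e.g. spark.read.parquet) or via pyarrow / fastavro
        raise NotImplementedError(f"use pyarrow/fastavro for {suffix}")
    raise ValueError(f"unsupported format: {suffix}")
```

In a Glue or Spark job the dispatch is usually unnecessary, since `spark.read` handles all three formats directly; the sketch only illustrates the batch-side shape of the requirement.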
PREFERRED BACKGROUND:
Tier 1/2 candidates from IITs/NITs/IIITs preferred
However, relevant experience and a learning attitude take precedence
Job title: Azure Architect
Locations: Noida, Pune, Bangalore and Mumbai
Responsibilities:
- Develop and maintain scalable architecture, database design and data pipelines and build out new Data Source integrations to support continuing increases in data volume and complexity
- Design and Develop the Data lake, Data warehouse using Azure Cloud Services
- Assist in designing end to end data and Analytics solution architecture and perform POCs within Azure
- Drive the design, sizing, POC setup, etc. of Azure environments and related services for the use cases and the solutions
- Review solution requirements and support architecture design to ensure the selection of appropriate technology, efficient use of resources, and integration of multiple systems and technologies.
- Must possess good client-facing experience with the ability to facilitate requirements sessions and lead teams
- Support internal presentations to technical and business teams
- Provide technical guidance, mentoring and code review, design level technical best practices
Experience Needed:
- 12-15 years of industry experience, including at least 3 years in an architect role and at least 3 to 4 years designing and building analytics solutions in Azure.
- Experience in architecting data ingestion/integration frameworks capable of processing structured, semi-structured & unstructured data sets in batch & real-time
- Hands-on experience in the design of reporting schemas, data marts and development of reporting solutions
- Develop batch processing, streaming and integration solutions and process Structured and Non-Structured Data
- Demonstrated experience with ETL development both on-premises and in the cloud using SSIS, Data Factory, and Azure Analysis Services and other ETL technologies.
- Experience performing design, development, and deployment using Azure services (Azure Synapse, Data Factory, Azure Data Lake Storage, Databricks, Python, and SSIS)
- Worked with transactional, temporal, time series, and structured and unstructured data.
- Deep understanding of the operational dependencies of applications, networks, systems, security, and policy (both on-premise and in the cloud; VMs, Networking, VPN (Express Route), Active Directory, Storage (Blob, etc.), Windows/Linux).
Mandatory Skills: Azure Synapse, Data Factory, Azure Data Lake Storage, Azure DW, Databricks, Python
Experience with Rest APIs;
NodeJS, Webpack;
Grunt, Gulp;
Git, SVN;
Proficient in HTML, CSS (LESS/SASS), responsive design, semantic markup;
Awareness of cross-browser compatibility issues and client-side performance considerations




