
· 10+ years of Information Technology experience, preferably with telecom/wireless service providers.
· Experience in designing data solutions following Agile practices (SAFe methodology); designing for testability, deployability, and releasability; rapid prototyping, data modeling, and decentralized innovation.
· Demonstrated understanding, and ideally use, of at least one recognised architecture framework or standard, e.g. TOGAF, the Zachman Framework, etc.
· The ability to apply data, research, and professional judgment and experience to ensure our products make the biggest difference to consumers.
· Demonstrated ability to work collaboratively.
· Excellent written, verbal, and social skills; you will interact with all types of people (user experience designers, developers, managers, marketers, etc.).
· Ability to work independently and with minimal supervision in a fast-paced, multi-project environment.
· Technologies: .NET, AWS, Azure; Azure Synapse, NiFi, RDS, Apache Kafka, Azure Databricks, Azure Data Lake Storage, Power BI, Reporting Analytics, QlikView, SQL on-prem data warehouse; BSS, OSS & Enterprise Support Systems.

JOB DESCRIPTION/PREFERRED QUALIFICATIONS:
REQUIRED SKILLS/COMPETENCIES:
Programming Languages:
- Strong in Python, data structures, and algorithms.
- Hands-on with NumPy, Pandas, Scikit-learn for ML prototyping.
Machine Learning Frameworks:
- Understanding of supervised/unsupervised learning, regularization, feature engineering, model selection, cross-validation, ensemble methods (XGBoost, LightGBM).
Deep Learning Techniques:
- Proficiency with PyTorch or TensorFlow/Keras.
- Knowledge of CNNs, RNNs, LSTMs, Transformers, Attention mechanisms.
- Familiarity with optimization (Adam, SGD), dropout, batch norm.
LLMs & RAG:
- Hugging Face Transformers (tokenizers, embeddings, model fine-tuning).
- Vector databases (Milvus, FAISS, Pinecone, ElasticSearch).
- Prompt engineering, function/tool calling, JSON schema outputs.
Data & Tools:
- SQL fundamentals; exposure to data wrangling and pipelines.
- Git/GitHub, Jupyter, basic Docker.
WHAT ARE WE LOOKING FOR?
- Solid academic foundation with strong applied ML/DL exposure.
- Curiosity to learn cutting-edge AI and willingness to experiment.
- Clear communicator who can explain ML/LLM trade-offs simply.
- Strong problem-solving and ownership mindset.
MINIMUM QUALIFICATIONS:
- Doctorate (Academic) degree and 2 years of related work experience; or Master's degree and 5 years of related work experience; or Bachelor's degree and 7 years of related work experience in building AI systems/solutions with Machine Learning, Deep Learning, and LLMs.
MUST-HAVES:
- Education/qualification: Preferably from a premier institute such as IIT, IISc, IIIT, NIT, or BITS, or a regional tier-1 college.
- Doctorate (Academic) Degree and 2 years related work experience; or Master's Level Degree and related work experience of 5 years; or Bachelor's Level Degree and related work experience of 7 years
- Minimum 5 years of experience in the mandatory skills: Python, Deep Learning, Machine Learning, Algorithm Development, and Image Processing
- 3.5 to 4 years of proficiency with PyTorch or TensorFlow/Keras
- Candidates with engineering product company experience (current or past) have a higher chance of being shortlisted
QUESTIONNAIRE:
Do you have at least 5 years of experience with Python, Deep Learning, Machine Learning, Algorithm Development, and Image Processing? Please mention the skills and years of experience:
Do you have experience with PyTorch or TensorFlow / Keras?
- PyTorch
- TensorFlow / Keras
- Both
How many years of experience do you have with PyTorch or TensorFlow / Keras?
- Less than 3 years
- 3 to 3.5 years
- 3.5 to 4 years
- More than 4 years
Is the candidate willing to relocate to Chennai?
- Ready to relocate
- Based in Chennai
What type of company have you worked for in your career?
- Service-based IT company
- Product company
- Semiconductor company
- Hardware manufacturing company
- None of the above

Requirements / Experience:
- 1-3+ years of PHP application development experience
- Strong experience using Laravel, CodeIgniter, or Symfony frameworks
- Strong practical experience and theoretical knowledge of PHP
- Front-end and back-end development expertise
- JavaScript (Vue.js, React, Node.js, TypeScript, RequireJS)
- Front-end frameworks (Bootstrap, Tailwind, etc.)
- Good background with SQL Databases (writing queries, optimization, query builders usage, ORMs)
- API development and documentation (OpenAPI)
- Troubleshooting (ability to respond quickly and effectively to resolve technical issues)
- Solution-making (ability to propose the most effective technical solutions)
- NoSQL experience is nice to have
- Ability to work on existing live projects and make modifications as required
- Ability to work both independently and in a team
Hi All,
*Urgent Hiring*
Role: IT Manager
Exp: 10+
Shift: 8 AM to 5 PM (Monday to Saturday; alternate Saturdays off)
Location: Daryaganj, Delhi
Must have experience in a broking firm
Mandatory skill: Colocation (on the NSE & BSE exchanges)
• Worked on ESXi 5.0 & vCenter 5.0; cloning, creation, and management of virtual machines.
• Performed P2V & V2V conversions; vMotion and Storage vMotion.
• Managed and troubleshot an arbitrage setup with 120 users working on the NSE (automated & manual), BSE, MCX, ACE & NCDEX exchanges.
Position Overview: We are seeking a talented Data Engineer with expertise in Power BI to join our team. The ideal candidate will be responsible for designing and implementing data pipelines, as well as developing insightful visualizations and reports using Power BI. Additionally, the candidate should have strong skills in Python, data analytics, PySpark, and Databricks. This role requires a blend of technical expertise, analytical thinking, and effective communication skills.
Key Responsibilities:
- Design, develop, and maintain data pipelines and architectures using PySpark and Databricks.
- Implement ETL processes to extract, transform, and load data from various sources into data warehouses or data lakes.
- Collaborate with data analysts and business stakeholders to understand data requirements and translate them into actionable insights.
- Develop interactive dashboards, reports, and visualizations using Power BI to communicate key metrics and trends.
- Optimize and tune data pipelines for performance, scalability, and reliability.
- Monitor and troubleshoot data infrastructure to ensure data quality, integrity, and availability.
- Implement security measures and best practices to protect sensitive data.
- Stay updated with emerging technologies and best practices in data engineering and data visualization.
- Document processes, workflows, and configurations to maintain a comprehensive knowledge base.
Requirements:
- Bachelor’s degree in Computer Science, Engineering, or related field. (Master’s degree preferred)
- Proven experience as a Data Engineer with expertise in Power BI, Python, PySpark, and Databricks.
- Strong proficiency in Power BI, including data modeling, DAX calculations, and creating interactive reports and dashboards.
- Solid understanding of data analytics concepts and techniques.
- Experience working with Big Data technologies such as Hadoop, Spark, or Kafka.
- Proficiency in programming languages such as Python and SQL.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud.
- Excellent analytical and problem-solving skills with attention to detail.
- Strong communication and collaboration skills to work effectively with cross-functional teams.
- Ability to work independently and manage multiple tasks simultaneously in a fast-paced environment.
Preferred Qualifications:
- Advanced degree in Computer Science, Engineering, or related field.
- Certifications in Power BI or related technologies.
- Experience with data visualization tools other than Power BI (e.g., Tableau, QlikView).
- Knowledge of machine learning concepts and frameworks.
Job Description
What will you do at Skillovilla ?
- Learn and understand Skillovilla's products, prospective clients, and services well.
- Understand customer needs and requirements, and map solutions to them.
- Identify and qualify new customers coming from our inbound and outbound lead funnel.
- Developing strong relationships with customers, connecting with key business executives and stakeholders.
- Document all pertinent customer information and conversations in the CRM system
- Respond to, engage, and qualify inbound/outbound leads and inquiries.
- Execute planned sales activities and develop a target list of high-potential new customers.
- Achieve monthly & quarterly quotas.
- Perform effective online demos to prospects.
What do we need from you?
- Exceptional communication skills, both oral and written, coupled with excellent listening skills and a positive, energetic phone presence (Hindi and English mandatory; knowledge of regional languages is a plus).
- Ability to make focused efforts to win customers.
- The ideal candidate should be strategy-driven with a research-based approach.
- Ability to multitask, prioritize and manage time effectively.
Academic Qualification: Any graduate or postgraduate degree (completed, or final-year students currently pursuing).
Location : Bangalore (Work from Office)
