
Big Data Jobs in Pune

Explore top Big Data job opportunities in Pune at leading companies and startups. All jobs are posted by verified employees, who can be contacted directly below.

Senior Software Engineer

Founded 2013 · Products and services
Location: Mumbai, Pune, Bengaluru (Bangalore)
Experience: 5 - 12 years
Salary: 18 - 30 lacs/annum

Responsibilities: You will interact directly with colleagues across all responsibility areas and with the Director of Engineering. The successful candidate for this position:
- Designs and implements well-architected and scalable solutions
- Collaborates with various teams in releasing high-quality software
- Performs code reviews and contributes to healthy coding conventions
- Assists in integration with customer systems
- Provides timely responses to internal technical questions
- Demonstrates leadership in navigating tense periods and keeping calm
Our Culture:
- Integrity and motivation are more important than skill and experience
- Cross-company team building and collaboration
- A diverse background and a highly talented, passionate group of individuals
Ideal Candidate: The ideal candidate is a senior engineer with substantial development experience and high standards for code quality and maintainability.
Basic Qualifications:
- 4-year degree in Computer Science or Computer Engineering
Preferred Qualifications:
- 5+ years of development experience
- Experience in Java or Scala
- Experience with all parts of the SDLC, including CI/CD and testing methodologies
- Experience working with NoSQL technologies and message queue management
- Self-motivated and able to work with minimal guidance
- Experience in a startup or rapid-growth product or project
- Comfortable with modern version control and agile development
Bonus Points:
- Experience working with microservices, containers, or big data technologies
- Working knowledge of cloud technologies such as GCE and AWS
- Writes blog posts and has a strong record on StackOverflow and similar sites

Job posted by Phagun Baya

Expert Elasticsearch

Founded 2010 · Products and services
Location: Pune
Experience: 4 - 10 years
Salary: 8 - 25 lacs/annum

Job Profile:
- Hands-on experience working with Elasticsearch 5.x or 2.x
- Hands-on experience programming in Python or Node.js
- Understand product requirements and map them to the relevant Elasticsearch features
- In-depth understanding of analyzers, mappers, nested queries, aggregations, synonyms, significant terms, etc.
- In-depth understanding of scoring, including function score and custom scripting (a brief query sketch follows this profile)
- Experience in handling large indexes, sharding, and maintaining production-level clusters
- Experience working with add-on tools like Kibana, Logstash, Graph, and Machine Learning will be an added advantage
Required experience: 2 to 7 years
Required qualification:
- Strong foundation in computer science, with strong competencies in data structures, algorithms, and software design
- Bachelor's or Master's degree in Computer Science or Engineering
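For illustration only, here is a minimal sketch of the kind of query work described above, assuming the official elasticsearch-py client and a hypothetical local "products" index; the field names and values are placeholders, not part of the role.

```python
# Minimal sketch: function_score query plus an aggregation, assuming
# elasticsearch-py and a hypothetical "products" index on localhost.
from elasticsearch import Elasticsearch

es = Elasticsearch(["localhost:9200"])

body = {
    "query": {
        "function_score": {
            # Base relevance query on a text field.
            "query": {"match": {"title": "wireless headphones"}},
            # Boost results by a numeric popularity field.
            "field_value_factor": {
                "field": "popularity",
                "modifier": "log1p",
                "missing": 1,
            },
        }
    },
    # Bucket the matching documents by brand alongside the hits.
    "aggs": {"by_brand": {"terms": {"field": "brand", "size": 10}}},
    "size": 5,
}

resp = es.search(index="products", body=body)
for hit in resp["hits"]["hits"]:
    print(hit["_score"], hit["_source"].get("title"))
for bucket in resp["aggregations"]["by_brand"]["buckets"]:
    print(bucket["key"], bucket["doc_count"])
```

On Elasticsearch 5.x the terms aggregation would typically target a keyword sub-field (for example brand.keyword) rather than an analyzed text field.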

Job posted by Abhijit Puri

Ruby on Rails Developer

Founded 2007 · Products and services
Location: Navi Mumbai, Pune
Experience: 3 - 6 years
Salary: 5 - 9 lacs/annum

Dear Candidate, please find the details below.
Role: Ruby on Rails Developer
Years of experience: 3 to 6 years
Required skills:
- Ruby, Ruby on Rails; experience in developing web applications using Ruby and RoR
- Databases: PostgreSQL
- Knowledge of REST is an added advantage
- OS: Linux
Please share your details at anshuman.baghel@niyuj.com, including: Total Exp, Relevant Exp, Current CTC, Expected CTC, and Notice Period.
About Niyuj: Niyuj is a product engineering company that engages with the customer at different levels of the product development lifecycle in order to build quality products, on budget and on time. Founded in 2007 by a passionate technology leader, it has stable and seasoned leadership with hands-on experience working with, or consulting for, companies ranging from bootstrapped start-ups to large multinationals. With global experience across the US, Australia, and India, Niyuj has worked with Fortune 500 companies as well as prominent startups; clients include Symantec, VMware, Carbonite, and Edgewater Networks.
Domain areas we work in:
- CLOUD SERVICES: Enterprises are rushing to incorporate cloud computing, big data, and mobile into their IT infrastructures.
- BIG DATA ANALYTICS: Revolutionizing the way Fortune 1000 companies harness billions of data points and turn them into a competitive advantage.
- NETWORK AND SECURITY: Network- and security-related system-level work that meets customer demands and delivers real value.
Our prime customer, Carbonite, is America's #1 cloud backup and storage company, with over 1.5 million customers, headquartered in Boston, MA, and with offices in 15 locations across the world.
Your potential for exponential growth: Your experience and expertise would be a great addition to our team, and you will have an opportunity to work closely with industry leaders, literally sitting across the table and jointly building the future with folks who are noted gurus and industry veterans from prestigious institutions like the IITs and top US universities, with industry experience in Fortune 500 companies like EMC, Symantec, and VERITAS.

Job posted by Anshuman Baghel

Technical Lead - Big Data and Java

Founded 1997 · Products and services
Location: Pune
Experience: 2 - 7 years
Salary: 1 - 20 lacs/annum

Description: Does solving complex business problems and real-world challenges interest you? Do you enjoy seeing the impact your contributions make on a daily basis? Are you passionate about using data analytics to provide game-changing solutions to Global 2000 clients? Do you thrive in a dynamic work environment that constantly pushes you to be the best you can be, and more? Are you ready to work with smart colleagues who drive for excellence in everything they do? If you possess a solutions mindset, strong analytical skills, and commitment to be part of a tremendous journey, come join our growing, global team. See what Saama can do for your career and for your journey.
Position: Java / Big Data Lead (2162)
Location: Hinjewadi Phase 1, Pune
Type: Permanent, full time
Requirements; the candidate should be able to:
- Define application-level architecture and guide low-level database design
- Gather technical requirements and propose solutions based on the client's business and architectural needs
- Interact with prospective customers during product demos and evaluations
- Work internally with technology and business groups to define project specifications
- Showcase experience with cloud-based implementations and technically manage Big Data and J2EE projects
- Showcase hands-on programming and debugging skills in Spring, Hibernate, Java, JavaScript, JSP/Servlets, J2EE design patterns, and Python
- Have knowledge of service integration concepts, especially RESTful and SOAP-based web services (as sketched below)
- Design and develop solutions for non-functional requirements (performance analysis and tuning, benchmarking/load testing, security)
Impact on the business: Plays an important role in making Saama's solutions game changers for our strategic partners by using data science to solve core, complex business challenges.
Key relationships: Sales and pre-sales; product management; engineering; client organization (account management and delivery).
Saama competencies: INTEGRITY (we do the right things), INNOVATION (we change the game), TRANSPARENCY (we communicate openly), COLLABORATION (we work as one team), PROBLEM-SOLVING (we solve core, complex business challenges), ENJOY & CELEBRATE (we have fun).
Competencies: Self-starter who gets results with minimal support and direction in a fast-paced environment. Takes initiative; challenges the status quo to drive change. Learns quickly; takes smart risks to experiment and learn. Works well with others; builds trust and maintains credibility. Planful: identifies and confirms key requirements in dynamic environments; anticipates tasks and contingencies. Communicates effectively with clients and all key stakeholders, both verbally and in writing. Stays the course despite challenges and setbacks. Works well under pressure. Strong analytical skills; able to apply inductive and deductive thinking to generate solutions for complex problems.
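As a concrete illustration of the RESTful service integration mentioned above, here is a minimal, hedged sketch in Python using the requests library; the base URL, endpoints, and fields are hypothetical and are not tied to Saama's actual services.

```python
# Minimal sketch of calling a JSON REST API; the service and fields are made up.
import requests

BASE_URL = "https://api.example.com/v1"

def fetch_order(order_id: str) -> dict:
    """GET a single order and return its JSON payload."""
    resp = requests.get(f"{BASE_URL}/orders/{order_id}", timeout=10)
    resp.raise_for_status()  # surface HTTP errors early
    return resp.json()

def create_order(payload: dict) -> dict:
    """POST a new order as JSON and return the created resource."""
    resp = requests.post(f"{BASE_URL}/orders", json=payload, timeout=10)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    created = create_order({"sku": "ABC-123", "quantity": 2})
    print(fetch_order(created["id"]))
```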

Job posted by Sandeep Chaudhary

Data Scientist

Founded 1997 · Products and services
Location: Pune
Experience: 4 - 8 years
Salary: 1 - 16 lacs/annum

Description:
- Must have direct, hands-on experience (4+ years) building complex data science solutions
- Must have fundamental knowledge of inferential statistics
- Should have worked on predictive modelling using Python or R (see the sketch after this list)
- Experience should include: file I/O, data harmonization, data exploration, machine learning techniques (supervised and unsupervised), multi-dimensional array processing, deep learning, NLP, and image processing
- Prior experience in the healthcare domain is a plus
- Experience using Big Data is a plus
- Should have excellent analytical and problem-solving ability and be able to grasp new concepts quickly
- Should be well familiar with the Agile project management methodology
- Should have excellent written and verbal communication skills
- Should be a team player with an open mind
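To make the predictive-modelling requirement concrete, here is a minimal supervised-learning sketch in Python, assuming scikit-learn is available; it uses a bundled toy dataset rather than any healthcare data, and the model choice is illustrative.

```python
# Minimal sketch: train/evaluate loop for a binary classifier with scikit-learn.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Toy dataset standing in for real project data.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Evaluate with a threshold-free metric on held-out data.
probs = model.predict_proba(X_test)[:, 1]
print("ROC AUC:", round(roc_auc_score(y_test, probs), 3))
```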

Job posted by Sandeep Chaudhary

Bigdata Lead

Founded 1997 · Products and services
Location: Pune
Experience: 2 - 5 years
Salary: 1 - 18 lacs/annum

Description:
- Deep experience with, and understanding of, Apache Hadoop and surrounding technologies is required, including Spark, Impala, Hive, Flume, Parquet, and MapReduce
- Strong understanding of development languages including Java, Python, Scala, and shell scripting
- Expertise in Apache Spark 2.x framework principles and usage
- Should be proficient in developing Spark batch and streaming jobs in Python, Scala, or Java (a minimal streaming sketch follows this list)
- Should have proven experience in performance tuning of Spark applications, from both the application-code and the configuration perspective
- Should be proficient in Kafka and its integration with Spark
- Should be proficient in Spark SQL and data warehousing techniques using Hive
- Should be very proficient in Unix shell scripting and in operating on Linux
- Should have knowledge of cloud-based infrastructure
- Good experience in tuning Spark applications and performance improvements
- Strong understanding of data profiling concepts and the ability to operationalize analyses into design and development activities
- Experience with software development best practices: version control systems, automated builds, etc.
- Experienced in, and able to lead, the phases of the software development life cycle on any project (feasibility planning, analysis, development, integration, test, and implementation)
- Capable of working within a team or as an individual
- Experience creating technical documentation
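Here is a minimal sketch of a Spark 2.x Structured Streaming job that consumes a Kafka topic, of the kind this role describes; it assumes PySpark with the spark-sql-kafka package on the classpath, and the broker address, topic name, and checkpoint path are placeholders.

```python
# Minimal sketch: read a Kafka topic with Structured Streaming and print
# per-minute event counts; broker, topic and checkpoint path are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, window

spark = SparkSession.builder.appName("kafka-stream-demo").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
    # Kafka values arrive as bytes; keep the payload and the record timestamp.
    .selectExpr("CAST(value AS STRING) AS value", "timestamp")
)

# Count events per 1-minute window.
counts = events.groupBy(window(col("timestamp"), "1 minute")).count()

query = (
    counts.writeStream.outputMode("complete")
    .format("console")
    .option("checkpointLocation", "/tmp/checkpoints/kafka-stream-demo")
    .start()
)
query.awaitTermination()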

Job posted by Sandeep Chaudhary

Big Data Architect

Founded 1997 · Products and services
Location: Pune
Experience: 2 - 5 years
Salary: 1 - 30 lacs/annum

Description: Does solving complex business problems and real-world challenges interest you? Do you enjoy seeing the impact your contributions make on a daily basis? Are you passionate about using data analytics to provide game-changing solutions to Global 2000 clients? Do you thrive in a dynamic work environment that constantly pushes you to be the best you can be, and more? Are you ready to work with smart colleagues who drive for excellence in everything they do? If you possess a solutions mindset, strong technological expertise, and commitment to be part of a tremendous journey, come join our growing, global team. See what Saama can do for your career and for your journey.
Impact on the business: The candidate would play a key role in delivering success by leveraging web and Big Data technologies and tools to fulfil the client's business objectives.
Responsibilities:
- Participate in requirement-gathering sessions with business users and stakeholders to understand the business needs
- Understand functional and non-functional requirements and define the technical architecture and design to cater to them
- Produce a detailed technical design document to match the solution design specifications
- Review and validate effort estimates produced by the development team for the design and build phases
- Understand and apply the company's solutions and frameworks to the design when needed
- Collaborate with the development team to produce technical specifications for custom development and systems integration requirements
- Participate in, and lead when needed, project meetings with the customer
- Collaborate with senior architects in the customer organization and convince them of, and defend, design and architecture decisions for the project
- Be a technical mentor to the development team
Required skills:
- Experience in designing scalable, complex distributed systems
- Hands-on development experience in the Big Data Hadoop ecosystem and analytics space
- Experience working with cloud storage solutions in AWS, Azure, etc.
- MS/BS degree in Computer Science, Mathematics, Engineering, or a related field
- 12 years of experience as a technology leader designing and developing data architecture solutions, with more than 2 years specializing in big data architecture or data analytics
- Experience implementing solutions using Big Data technologies: Hadoop, MapReduce, Pig, Hive, Spark, Storm, Impala, Oozie, Flume, ZooKeeper, Sqoop, etc.
- Good understanding of NoSQL and prior experience working with NoSQL databases such as HBase, MongoDB, and Cassandra (see the sketch after this description)
Competencies: Self-starter who gets results with minimal support and direction in a fast-paced environment. Takes initiative; challenges the status quo to drive change. Learns quickly; takes smart risks to experiment and learn. Works well with others; builds trust and maintains credibility. Identifies and confirms key requirements in dynamic environments; anticipates tasks and contingencies. Strong analytical skills; able to apply creative thinking to generate solutions for complex problems. Communicates effectively with clients and all key stakeholders, both verbally and in writing.
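As one concrete example of the NoSQL experience listed above, here is a minimal sketch using MongoDB via pymongo (one of the stores named in the requirements); the connection string, database name, and documents are illustrative.

```python
# Minimal sketch: insert a few documents and run a simple aggregation in MongoDB.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["analytics_demo"]

# Insert a few illustrative event documents.
db.events.insert_many([
    {"user": "u1", "action": "view", "item": "A"},
    {"user": "u1", "action": "buy", "item": "A"},
    {"user": "u2", "action": "view", "item": "B"},
])

# Aggregate: count events per action type.
pipeline = [{"$group": {"_id": "$action", "count": {"$sum": 1}}}]
for row in db.events.aggregate(pipeline):
    print(row)
```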

Job posted by Sandeep Chaudhary

Senior Software Engineer - Backend

Founded 2015 · Products and services
Location: Pune
Experience: 5 - 10 years
Salary: 17 - 25 lacs/annum

Responsibilities:
- Ensure timely and top-quality product delivery
- Ensure that the end product is fully and correctly defined and documented
- Ensure implementation/continuous improvement of formal processes to support product development activities
- Drive the architecture/design decisions needed to achieve cost-effective and high-performance results
- Conduct feasibility analysis, produce functional and design specifications of proposed new features
- Provide helpful and productive code reviews for peers and junior members of the team
- Troubleshoot complex issues discovered in-house as well as in customer environments
Qualifications:
- Strong computer science fundamentals in algorithms, data structures, databases, operating systems, etc.
- Expertise in Java, object-oriented programming, and design patterns
- Experience in coding and implementing scalable solutions in a large-scale distributed environment
- Working experience in a Linux/UNIX environment is good to have
- Experience with relational databases and database concepts, preferably MySQL
- Experience with SQL and Java optimization for real-time systems
- Familiarity with version control systems (Git) and build tools like Maven
- Excellent interpersonal, written, and verbal communication skills
- BE/B.Tech./M.Sc./MCS/MCA in Computers or equivalent

Job posted by Sourabh Gandhe

Sr Java Application Developer for Enterprise Data Analytics Platform

Founded 2014 · Products and services
Location: Pune
Experience: 3 - 7 years
Salary: 4 - 12 lacs/annum

Must-have skills:
- Very strong coding skills in core Java (1.5 and above)
- Should be able to analyze complex code structures, data structures, and algorithms/logic
- Should have hands-on knowledge of working on Java multithreading (juml) programs
- Should have expertise in the Java Collections framework
- Must have good exposure to Struts/JSP services, jQuery/Ajax, and JSON-based UI rendering
Good-to-have skills (not mandatory):
- Good working knowledge of JavaScript and the jQuery framework
- Should have used the HTML5/CSS5/Node.js/D3 framework in at least one earlier project
- Hands-on experience with the latest technologies like Cassandra, Solr, and Hadoop would be an advantage
- Knowledge of graph structures would be desirable

Job posted by Neha Ambastha

DevOps Head

Founded 2009 · Products and services
Location: Pune
Experience: 8 - 12 years
Salary: 5 - 15 lacs/annum

DevOps Architect, responsible for designing and implementing DevOps-related work tasks and for clarifying system/deployment-related issues directly with the customer.

Job posted by Pankaj Gajjar

Technical Architect

Founded 2009 · Products and services
Location: Pune
Experience: 3 - 15 years
Salary: 2 - 12 lacs/annum

Work on different POCs. Experience in Java/J2EE programming and coding, and much more.

Job posted by Pankaj Gajjar

DevOps Engineer

Founded 2009 · Products and services
Location: Pune
Experience: 3 - 10 years
Salary: 6 - 20 lacs/annum

- Strong background in Linux/Unix administration
- Experience with CI tools like Jenkins
- Experience with automation/configuration management using Docker, Puppet, Ansible, Chef, or an equivalent
- Build, release, and configuration management of production systems
- System troubleshooting and problem solving across platform and application domains
- Deploying, automating, maintaining, and managing an AWS cloud-based production system, to ensure the availability, performance, scalability, and security of production systems
- Pre-production acceptance testing to help assure the quality of our products and services
- Evaluate new technology options and vendor products
- Strong experience with SQL and MySQL (NoSQL experience is an add-on)
- Suggesting architecture improvements and recommending process improvements
- Understanding of cloud-based services and hosting (e.g. Amazon AWS); ensuring critical system security through best-in-class cloud security solutions
- Ability to use a wide variety of open source technologies and cloud services (experience with AWS is an add-on)
- A working understanding of code and scripting (PHP, Python, Perl and/or Ruby) will be an add-on (see the monitoring sketch after this list)
- Knowledge of Ant, Maven, or other build and release tools will be an add-on
- AWS: 2+ years' experience using a broad range of AWS technologies (e.g. EC2, RDS, ELB, EBS, S3, VPC, Glacier, IAM, CloudWatch, KMS) to develop and maintain an Amazon AWS based cloud solution, with an emphasis on best-practice cloud security
- DevOps: Solid experience as a DevOps Engineer in a 24x7 uptime Amazon AWS environment, including automation experience with configuration management tools
- Monitoring tools: Experience with system monitoring tools (e.g. Nagios, Zabbix)
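To illustrate the scripting-plus-AWS-monitoring side of this role, here is a minimal sketch in Python using boto3 to publish a custom CloudWatch metric; it assumes AWS credentials and a region are already configured in the environment, and the namespace and metric name are made up for the example.

```python
# Minimal sketch: push a custom CloudWatch metric from an operations script.
import boto3

cloudwatch = boto3.client("cloudwatch")

def publish_queue_depth(depth: int) -> None:
    """Publish a hypothetical queue-depth metric so alarms and dashboards can track it."""
    cloudwatch.put_metric_data(
        Namespace="Demo/Backend",          # illustrative namespace
        MetricData=[{
            "MetricName": "QueueDepth",    # illustrative metric name
            "Value": float(depth),
            "Unit": "Count",
        }],
    )

if __name__ == "__main__":
    publish_queue_depth(42)
```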

Job posted by Pankaj Gajjar

Technical Architect

Founded 2006 · Products and services
Location: Pune
Experience: 9 - 20+ years
Salary: 13 - 25 lacs/annum

The hunt is for an AWS Big Data / DWH Architect with the ability to manage effective relationships with a wide range of stakeholders (customers and team members alike). The incumbent will demonstrate personal commitment and accountability to ensure standards are continuously sustained and improved, both within the internal teams and with partner organizations and suppliers. We at Nitor Infotech, a product engineering services company, are always on the hunt for some of the best talent in the IT industry, in keeping with our trend of "what's next in IT". We are scouting for result-oriented resources with a passion for products, technology services, and creating great customer experiences; someone who can take the current expertise and footprint of Nitor Infotech Inc. to an altogether different dimension and level, in tune with emerging market trends, and ensure that Brilliance @ Work continues to prevail in whatever we do. Nitor Infotech works with global ISVs to help them build and accelerate their product development. Nitor is able to do so because product development is its DNA, enriched by its 10 years of expertise, best practices, frameworks, and accelerators. Because of this ability, Nitor Infotech has been able to build business relationships with product companies having revenues from $50 million to $1 billion.
Requirements:
- 7-12+ years of relevant experience working in the database, BI, and analytics space, with 0-2 years of experience architecting and designing data warehouses, including 2 to 3 years in the Big Data ecosystem
- Experience in data warehouse design in AWS
- Strong architecting, programming, and design skills, with a proven track record of architecting and building large-scale, distributed big data solutions
- Professional and technical advice on Big Data concepts and technologies, in particular highlighting the business potential through real-time analysis
- Provides technical leadership in the Big Data space (Hadoop stack: MapReduce, HDFS, Pig, Hive, HBase, Flume, Sqoop, etc.; NoSQL stores: MongoDB, Cassandra, HBase, etc.)
- Performance tuning of Hadoop clusters and Hadoop MapReduce routines (a minimal MapReduce-style sketch follows this list)
- Evaluate and recommend the Big Data technology stack for the platform
- Drive significant technology initiatives end to end and across multiple layers of the architecture
- Should have breadth of BI knowledge, including MSBI, database design, and newer visualization tools like Tableau, QlikView, and Power BI
- Should understand the internals and intricacies of old and new DB platforms, including: strong RDBMS fundamentals in SQL Server, MySQL, or Oracle; DB and DWH design; designing semantic models using OLAP and tabular models with MS and non-MS tools; NoSQL DBs including document, graph, search, and columnar DBs
- Excellent communication skills and a strong ability to build good rapport with prospective and existing customers
- Be a mentor and go-to person for junior team members
Qualification and experience: BE/ME/B.Tech/M.Tech, BCA/MCA/BCS/MCS, or any other degree with a relevant IT qualification.
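Here is a minimal, self-contained sketch of a MapReduce-style routine of the kind referenced above, written as a Hadoop Streaming mapper and reducer in Python; the file name and invocation are illustrative, and the same script can be tested locally with a shell pipeline before running it under hadoop-streaming with the usual -mapper/-reducer options.

```python
# wordcount.py -- minimal Hadoop Streaming sketch (mapper and reducer in one file).
# Local test:  cat input.txt | python wordcount.py map | sort | python wordcount.py reduce
import sys
from itertools import groupby

def mapper():
    # Emit "word<TAB>1" for every token on stdin.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word.lower()}\t1")

def reducer():
    # Hadoop delivers mapper output sorted by key, so consecutive keys group together.
    pairs = (line.rstrip("\n").split("\t", 1) for line in sys.stdin)
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        print(f"{word}\t{sum(int(count) for _, count in group)}")

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```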

Job posted by Balakumar Mohan

Python Developer

Founded 2015 · Products and services
Location: Pune
Experience: 2 - 5 years
Salary: 5 - 10 lacs/annum

We are an early-stage startup working in the space of analytics, big data, machine learning, data visualization on multiple platforms, and SaaS. We have offices in Palo Alto and at WTC, Kharadi, Pune, and count some marquee names among our customers. We are looking for a really good Python programmer who MUST have scientific programming experience. Hands-on experience with numpy and the Python scientific stack is a must (a minimal sketch follows below). Demonstrated ability to track and work with hundreds to thousands of files and GB-TB of data. Exposure to ML and data mining algorithms. You need to be comfortable working in a Unix environment and with SQL. You will be required to do the following:
- Use command-line tools to perform data conversion and analysis
- Support other team members in retrieving and archiving experimental results
- Quickly write scripts to automate routine analysis tasks
- Create insightful, simple graphics to represent complex trends
- Explore, design, and invent new tools and design patterns to solve complex big data problems
Experience working on a long-term, lab-based project (academic experience acceptable).
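Here is a minimal numpy sketch of the scientific-stack and data-exploration work described above; the input file, units, and clipping bounds are hypothetical.

```python
# Minimal sketch: load a column of measurements, harmonize units, and summarize.
import numpy as np

# Hypothetical CSV with one numeric column of readings in millimetres.
readings_mm = np.loadtxt("readings.csv", delimiter=",")

# Data harmonization: convert to metres and clip obvious sensor glitches.
readings_m = np.clip(readings_mm / 1000.0, 0.0, 10.0)

# Quick exploration: summary statistics and a coarse histogram.
print("n =", readings_m.size)
print("mean =", readings_m.mean(), "std =", readings_m.std())
hist, edges = np.histogram(readings_m, bins=10)
print(list(zip(edges.round(2), hist)))
```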

Job posted by Nischal Vohra

Power BI Developer

Founded 2014 · Products and services
Location: Pune, Kharadi
Experience: 2 - 5 years
Salary: 4 - 7 lacs/annum

- Should be able to create awesome dashboards
- Should have hands-on knowledge of all of the following: visualizations, datasets, reports, dashboards, and tiles
- Excellent querying skills using T-SQL
- Should have prior exposure to SSRS and/or SSAS
- Working knowledge of Microsoft Power Pivot, Power View, and Power BI Desktop

Job posted by Yogita Purandare

Freelance Faculty

Founded 2009 · Products and services
Location: Anywhere, United States, Canada
Experience: 3 - 10 years
Salary: 2 - 10 lacs/annum

To introduce myself, I head Global Faculty Acquisition for Simplilearn. About my company: SIMPLILEARN has transformed 500,000+ careers across 150+ countries with 400+ courses, and yes, we are a Registered Professional Education Provider offering PMI-PMP, PRINCE2, ITIL (Foundation, Intermediate & Expert), MSP, COBIT, Six Sigma (GB, BB & Lean Management), Financial Modeling with MS Excel, CSM, PMI-ACP, RMP, CISSP, CTFL, CISA, CFA Level 1, CCNA, CCNP, Big Data Hadoop, CBAP, iOS, TOGAF, Tableau, Digital Marketing, Data Scientist with Python, Data Science with SAS & Excel, Big Data Hadoop Developer & Administrator, Apache Spark and Scala, Tableau Desktop 9, Agile Scrum Master, Salesforce Platform Developer, Azure & Google Cloud. Our official website: www.simplilearn.com. If you're interested in teaching, interacting, sharing real-life experiences, and have a passion for transforming careers, please join hands with us.
Onboarding process:
- An updated CV needs to be sent to my email id, with copies of the relevant certificates.
- Sample e-learning access will be shared with a 15-day trial after your registration on our website.
- My subject matter expert will evaluate you on your areas of expertise over a telephonic conversation (duration: 15 to 20 minutes).
- Commercial discussion.
- We will register you for our ongoing online sessions to introduce you to our course content and the Simplilearn style of teaching.
- A demo will be conducted to check your training style and internet connectivity.
- Freelancer Master Service Agreement.
Payment process:
- Once a workshop, or the last day of training for the batch, is completed, you have to share your invoice.
- An automated tracking ID will be shared from our automated ticketing system.
- Our faculty group will verify the details provided and share the invoice with our internal finance team to process your payment; if any additional information is required, we will coordinate with you.
- Payment will be processed within 15 working days as per the policy; the 15 days count from the date the invoice is received.
Please share your updated CV to proceed to the next step of the onboarding process.

Job posted by STEVEN JOHN

Freelance Trainers

Founded 2015 · Products and services
Location: Anywhere
Experience: 8 - 11 years
Salary: 5 - 10 lacs/annum

We are a team with a mission: a mission to create and deliver great learning experiences to engineering students through various workshops and courses. If you are an industry professional and you:
- See great scope for improvement in higher technical education across the country and connect with our purpose of impacting it for good
- Are keen on sharing your technical expertise to enhance the practical learning of students
- Are innovative in your ways of creating content and delivering it
- Don't mind earning a few extra bucks while doing this in your free time
Then buzz us at info@monkfox.com and let us discuss how, together, we can take technological education in the country to new heights.

Job posted by Tanu Mehra
Why apply via CutShort?
Connect with actual hiring teams and get a fast response. No third-party recruiters. No spam.