Title: Intuition behind Latent Dirichlet Allocation (LDA) for Topic Modeling
Duration: 6:01
Date of publication:
Views: 142K
|
How can random assignment of topics lead to a correct assignment of topics? Comment from : @preethamkumar91
|
Great video! Just a quick correction: at @1:11 you have written "Stemming: merging words that are equivalent in meaning." That is actually called lemmatization. Stemming reduces words to their radicals; since the topic analysis can be done with just those, it reduces the size of the analyzed data. Comment from : @VictorAlmeida27
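The distinction this commenter draws can be shown with a toy sketch. Both functions below are simplified illustrations written for this thread, not real NLP library code, and the lemma lookup table is a made-up stand-in for a dictionary:

```python
# Toy illustration of stemming vs. lemmatization.
# Stemming chops suffixes to reach a crude radical (often not a real word);
# lemmatization maps an inflected form to its dictionary headword.

def naive_stem(word):
    """Strip a common suffix to reduce the word to a rough radical."""
    for suffix in ("ization", "izing", "ational", "ation", "ing", "ies", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

# Hypothetical mini-lexicon; a real lemmatizer uses a full dictionary
# plus part-of-speech information.
LEMMA_TABLE = {"better": "good", "ran": "run", "mice": "mouse"}

def naive_lemmatize(word):
    return LEMMA_TABLE.get(word, word)

print(naive_stem("running"))     # "runn" - a radical, not a word
print(naive_lemmatize("better")) # "good" - a dictionary headword
```

Note how the stemmer happily produces non-words like "runn"; for topic modeling that is usually fine, since only the identity of the token matters, not its readability.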
|
Well explained, simple and clear! Thank you! Comment from : @mariarocque7384
|
Useful Thank you! Comment from : @Juan-Hdez |
|
1000th like Comment from : @parthicle |
|
Observations are referred to as words. The feature set is referred to as the vocabulary. A feature is referred to as a document. And the resulting categories are referred to as topics. Is this a correct understanding? Please correct me if I am wrong. Comment from : @rajaramwalavalkar9187
|
ur way of speaking gives me a gamer vibe bro Comment from : @Admin_REX |
|
Great explanation! What type of inference is this? (Gibbs sampling, variational Bayes, etc) Comment from : @michaelyoder7250 |
|
always indian man always Comment from : @hence0182 |
|
It's not clear why he's subtracting it at 3:10. Comment from : @monamyers8324
|
What's the significance of the word 'dynamic' in the slide VII? Comment from : @krishnagarg6870 |
|
woow woow it was very helpful, as I am working on this algo Comment from : @_xkim00 |
|
Thank you so much. Great video; you used such a simple example to explain LDA. I wish I had seen your video a long time ago. Thanks again. Comment from : @robindong3802
|
Man, I lost you midway in "topics", but anyway I got an idea of what LDA actually is. Thanks :) Comment from : @akshaygera9097
|
Under this algorithm, a word would never be reassigned to the same topic again? Comment from : @abcd12272 |
|
How was "World Cup" reassigned to topic 1 based on the multiplication of the 2 matrices? Comment from : @abcd12272 |
|
Thank you for the cleanest and simplest explanation Comment from : @shagshaq |
|
Sir, I have done LDA using the scikit-learn library. When should we use the Gensim library, or are they the same? Comment from : @consistentthoughts826
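For readers with the same question: both libraries implement LDA, and the choice is mostly a workflow one (Gensim supports streaming corpora; scikit-learn fits the familiar fit/transform pipeline). A minimal scikit-learn sketch, assuming scikit-learn is installed and using four made-up documents:

```python
# Minimal LDA sketch with scikit-learn; the documents and topic count
# are illustrative, not from the video.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "the world cup final was a great football match",
    "the election results shaped the new government policy",
    "the striker scored twice in the cup match",
    "parliament debated the government budget policy",
]

# LDA works on raw word counts, not tf-idf weights.
counts = CountVectorizer(stop_words="english").fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

# Each row is a per-document topic distribution summing to 1.
doc_topic = lda.transform(counts)
print(doc_topic.shape)  # (4, 2)
```

Gensim's `LdaModel` exposes the same idea through its own corpus/dictionary objects, so results should be comparable up to randomness in the inference.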
|
Slide no 8 please Comment from : @ravindarmadishetty736 |
|
Very confusing Comment from : @snandi1603 |
|
You explained it really well, but try to elaborate the explanation so that it can be understood in one go. Going over the video once again is cumbersome. Comment from : @parthaprateempatra4278
|
Thank you for this video! Clearly explained. I would request a video on how to perform Dirichlet regression using R or Python. Thank you. Comment from : @priyanatraj5634
|
Hi Bhavesh, can you please explain how the areas, or rather the probabilities, are calculated? Comment from : @sudeshnadutta5702
|
You started well, with good examples, but by the middle and the end it was difficult for a newbie to this field like me to understand. Comment from : @romy5994
|
What's the corpus argument being passed? Comment from : @vibewithalexa
|
Thank you so much Precise and clear explanation !! Comment from : @rajsinghmaan3095 |
|
Couldn't understand: "how much the document likes the topic" times "how much the topic likes the word"!! Comment from : @arpitqw1
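That phrase describes the reassignment score used in Gibbs-sampling-style LDA updates: for a word in a document, each topic is scored by (how many words in this document are assigned to the topic) times (how often this word is assigned to the topic, corpus-wide, normalized by the topic's size). A minimal sketch with made-up counts (real implementations also add Dirichlet smoothing terms alpha and beta, and sample a topic in proportion to the scores rather than taking the maximum):

```python
# Score each topic for one word ("world cup") in one document.
# All counts are invented for illustration.
doc_topic_counts = {"topic1": 5, "topic2": 2}   # other words in this doc per topic
word_topic_counts = {"topic1": 8, "topic2": 1}  # this word's assignments corpus-wide
topic_totals = {"topic1": 40, "topic2": 30}     # total words assigned per topic

# score(t) = "doc likes t" * "t likes word"
scores = {
    t: doc_topic_counts[t] * (word_topic_counts[t] / topic_totals[t])
    for t in doc_topic_counts
}
best = max(scores, key=scores.get)
print(best, scores)  # topic1 wins: 5 * 8/40 = 1.0 vs 2 * 1/30 = 0.067
```

This is also the mechanism behind the earlier questions about reassigning "World Cup" by multiplying the two matrices: the two factors are exactly one entry from the document-topic matrix and one from the topic-word matrix.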
|
Please make a video on KL divergence; it would be a great help. Regards. Comment from : @ash_engineering
|
Best explanation ever, continue!!! Comment from : @fitnesscoach7
|
clear and crisp Comment from : @SajeedSk |
|
The matrix improvement part could be explained better, but this is definitely the best video on the topic (no pun intended). Thanks. Comment from : @VishalSingh-dl8oy
|
Really, you are the best. Regards. Comment from : @Aliabbashassan3402
|
Quick question regarding 2:32: you point out that a word is associated with multiple topics? I thought a word could only be associated with one topic, while a document can be associated with multiple topics. Comment from : @MasayoMusic
|
So one iteration of the algorithm is the same as going through the document and reassigning the topic for each word of the document, and do that for all the documents? Would it be wrong if I did the iteration N times on a single doc and did those N iterations for each document? Does the order of operations matter here? Comment from : @fancypants7533 |
|
Tks Bhavesh, good work! Comment from : @TheEscolaris |
|
This is so helpful Thanks Bhavesh :) Comment from : @rohanchadha3506 |
|
Hey, that's good work out there! Can you please give a link or something for the presentation? It would be really helpful. Comment from : @shubhammishra6687
|
This is by far the best explanation of LDA. I went through literally dozens of videos and none of them explained the technical details. Thank you for this video. Comment from : @nishantjha6412
|
Wonderfully explained. I was reluctant to read the theoretical and mathematical explanation of LDA in the Andrew Ng paper, and this gave me the whole idea in just one go. Great work! Comment from : @100damen
Related videos:

Latent Dirichlet Allocation (LDA) | Topic Modeling | Machine Learning, by TwinEd Productions

Topic modeling with latent dirichlet allocation (LDA), by Statistics Ninja

What is Latent Dirichlet Allocation LDA (Topic Modeling for Digital Humanities 03.01), by Python Tutorials for Digital Humanities

What is Latent Dirichlet Allocation (LDA) in Machine Learning?, by Data & Analytics

Amazon SageMaker's Built-in Algorithm Webinar Series: Latent Dirichlet Allocation (LDA), by Amazon Web Services

Introduction to Latent Dirichlet Allocation (LDA) | DS ML Algorithms, by V1shwanath

Latent Dirichlet Allocation (LDA) Video3, by Sanjay Prasad

Latent Dirichlet Allocation LDA in Azure ML, by Mitchell R. Wenger (Accounting Systems & Tech)

Latent Dirichlet Allocation (LDA) with Gibbs Sampling Explained, by Aladdin Persson

Latent Dirichlet Allocation (Part 1 of 2), by Serrano.Academy