sigmoid.social is one of the many independent Mastodon servers you can use to participate in the fediverse.
A social space for people researching, working with, or just interested in AI!

Server stats: 594 active users

#ise2025

1 post · 1 participant · 1 post today
Harald Sack:
In the ISE2025 lecture today, our students learned about unsupervised learning, using k-means clustering as the example. One nice hands-on exercise is image colour reduction based on k-means clustering, as demonstrated in a Colab notebook (based on the Python Data Science Handbook by VanderPlas).

Colab notebook: https://colab.research.google.com/drive/1lhdq2pynuwJKoXbspydECuWcPRw3-xxn?usp=sharing
Python Data Science Handbook: https://archive.org/details/python-data-science-handbook.pdf/mode/2up

#ise2025 #lecture @fizise @sourisnumerique @enorouzi #datascience #machinelearning #AI @KIT_Karlsruhe
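For reference, a minimal sketch of the colour-reduction idea (not the notebook's exact code), assuming scikit-learn and an (H, W, 3) RGB image as a NumPy array. Every pixel is a point in RGB space, so "reducing colours" is literally clustering:

```python
import numpy as np
from sklearn.cluster import KMeans

def reduce_colors(image: np.ndarray, n_colors: int = 16) -> np.ndarray:
    """Quantise an (H, W, 3) RGB image down to n_colors via k-means."""
    h, w, _ = image.shape
    pixels = image.reshape(-1, 3).astype(np.float64)
    # Cluster all pixels in RGB space; each centroid becomes one palette colour.
    kmeans = KMeans(n_clusters=n_colors, n_init=10, random_state=0).fit(pixels)
    # Replace every pixel by its nearest centroid.
    quantised = kmeans.cluster_centers_[kmeans.labels_]
    return quantised.reshape(h, w, 3).astype(image.dtype)
```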
Harald Sack:
Tomorrow, we will dive deeper into ontologies with OWL, the Web Ontology Language. However, I have been giving OWL lectures for almost 20 years now, and neither OWL nor the lecture has changed much. So I'm afraid I'm going to surprise/disappoint the students tomorrow when I switch off the presentation and start improvising a random OWL ontology with them on the blackboard ;-)

#ise2025 #OWL #semanticweb #semweb #RDF #knowledgegraphs @fiz_karlsruhe @fizise @tabea @sourisnumerique @enorouzi
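For anyone improvising along at home, a tiny blackboard-style ontology built with rdflib; the namespace and all class/property names here are invented for illustration:

```python
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/blackboard#")  # hypothetical namespace
g = Graph()
g.bind("ex", EX)

# Classes and a subclass axiom: every Owl is a Bird.
for cls in (EX.Bird, EX.Owl, EX.Mouse):
    g.add((cls, RDF.type, OWL.Class))
g.add((EX.Owl, RDFS.subClassOf, EX.Bird))

# An object property with domain and range.
g.add((EX.hunts, RDF.type, OWL.ObjectProperty))
g.add((EX.hunts, RDFS.domain, EX.Owl))
g.add((EX.hunts, RDFS.range, EX.Mouse))

print(g.serialize(format="turtle"))  # the improvised ontology, in Turtle
```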
Sarven Capadisli:
@lysander07 What I like about this view is that the info is not necessarily complete: typical real-world data. So some queries will not show everything there is to know about a topic, while a lot of the info may be about other things. It can be extended, though: "pay as you go".

Some students may also be interested in expressing their research output as a #KnowledgeGraph in its own right, playing an important role in #ScholarlyCommunication.

e.g. view: https://dokie.li/?graph=https://csarven.ca/linked-research-decentralised-web

seeAlso alt text.

#ISE2025
Harald Sack:
In today's ISE2025 lecture, we will introduce SPARQL as a query language for knowledge graphs. Again, I'm trying out "Dystopian Novels" as the example knowledge-graph playground. Let's see if the students know any of them. What do you think? ;-)

#dystopia #literature #ise2025 #semanticweb #semweb #knowledgegraphs #sparql #lecture @tabea @sourisnumerique @enorouzi
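A minimal rdflib sketch of such a playground; the namespace, resources, and property names below are invented for illustration, not the lecture's actual graph:

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

EX = Namespace("http://example.org/dystopia#")  # hypothetical namespace
g = Graph()

# A two-novel toy knowledge graph.
g.add((EX.NineteenEightyFour, RDF.type, EX.DystopianNovel))
g.add((EX.NineteenEightyFour, EX.author, Literal("George Orwell")))
g.add((EX.BraveNewWorld, RDF.type, EX.DystopianNovel))
g.add((EX.BraveNewWorld, EX.author, Literal("Aldous Huxley")))

# SPARQL: match every dystopian novel together with its author.
query = """
PREFIX ex: <http://example.org/dystopia#>
SELECT ?novel ?author
WHERE {
  ?novel a ex:DystopianNovel ;
         ex:author ?author .
}
"""
for row in g.query(query):
    print(row.novel, "-", row.author)
```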
Harald Sack:
Back in the lecture hall again after two exciting weeks of #ESWC2025 and #ISWS2025. This morning, we introduced our students to RDF, RDFS, RDF inferencing, and RDF reification.

#ise2025 #semanticweb #semweb #knowledgegraphs #rdf #reasoning #reification #lecture @fiz_karlsruhe @fizise @KIT_Karlsruhe @sourisnumerique @tabea @enorouzi
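For the reification part, a small rdflib sketch of the classic rdf:Statement pattern; the resources and the assertedBy property are invented for illustration:

```python
from rdflib import BNode, Graph, Literal, Namespace
from rdflib.namespace import RDF

EX = Namespace("http://example.org/ex#")  # hypothetical namespace
g = Graph()

# The plain triple: Alice knows Bob.
g.add((EX.Alice, EX.knows, EX.Bob))

# Classic RDF reification: a resource standing for the statement itself,
# so metadata can be attached to the assertion rather than to Alice or Bob.
stmt = BNode()
g.add((stmt, RDF.type, RDF.Statement))
g.add((stmt, RDF.subject, EX.Alice))
g.add((stmt, RDF.predicate, EX.knows))
g.add((stmt, RDF["object"], EX.Bob))  # item access avoids any attribute clash
g.add((stmt, EX.assertedBy, Literal("a lecture example")))
```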
Europe Says:
https://www.europesays.com/2146877/ South Korea's dotmill Makes History Producing Content for Las Vegas Sphere's Immersive LED Display #BLACKPINK #bts #dotmill #GalaxyCorporation #ISE2025 #LasVegasSphere #PyeongChangWinterOlympics #SamsungElectronics #SouthKorea
Harald Sack:
Last week, we continued our #ISE2025 lecture on distributional semantics with the introduction of neural language models (NLMs) and compared them to traditional statistical n-gram models.
Benefits of NLMs:
- Capturing long-range dependencies
- Computational and statistical tractability
- Improved generalisation
- Higher accuracy

@fiz_karlsruhe @fizise @tabea @sourisnumerique @enorouzi #llms #nlp #AI #lecture
Harald Sack:
In the #ISE2025 lecture today, we introduced our students to the concept of distributional semantics as the foundation of modern large language models. Historically, Wittgenstein was one of the important figures in the philosophy of language, stating that "the meaning of a word is its use in the language."

https://static1.squarespace.com/static/54889e73e4b0a2c1f9891289/t/564b61a4e4b04eca59c4d232/1447780772744/Ludwig.Wittgenstein.-.Philosophical.Investigations.pdf

#philosophy #wittgenstein #nlp #AI #llm #languagemodel #language #lecture @fiz_karlsruhe @fizise @tabea @enorouzi @sourisnumerique #AIart
Harald Sack:
Generating Shakespeare-like text with an n-gram language model is straightforward and quite simple. But don't expect too much of it. It will not be able to recreate a lost Shakespeare play for you ;-) It's merely a parrot, assembling plausible-sounding sentences out of fragments of the original Shakespeare texts...

#ise2025 #lecture #nlp #llm #languagemodel @fiz_karlsruhe @fizise @tabea @enorouzi @sourisnumerique #shakespeare #generativeAI #statistics
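A minimal bigram sketch of that parrot in plain Python; the toy corpus line is just for illustration, a real experiment would train on the full Shakespeare text:

```python
import random
from collections import defaultdict

def train_bigrams(text: str) -> dict:
    """Map each word to the list of words observed directly after it."""
    words = text.split()
    model = defaultdict(list)
    for w1, w2 in zip(words, words[1:]):
        model[w1].append(w2)
    return model

def generate(model: dict, start: str, length: int = 15) -> str:
    """Random walk through the bigram table, starting from a seed word."""
    word, out = start, [start]
    for _ in range(length - 1):
        followers = model.get(word)
        if not followers:
            break  # dead end: the word only occurred at the end of the corpus
        word = random.choice(followers)  # duplicates make this count-weighted
        out.append(word)
    return " ".join(out)

corpus = "all the world is a stage and all the men and women merely players"
print(generate(train_bigrams(corpus), start="all"))
```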
Harald Sack:
In our #ISE2025 lecture last Wednesday, we learned how n-gram language models use the Markov assumption and maximum likelihood estimation to predict the probability of a word occurring in a specific context (i.e. given the preceding n−1 words in the sequence).

#NLP #languagemodels #lecture @fizise @tabea @enorouzi @sourisnumerique @fiz_karlsruhe @KIT_Karlsruhe
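In symbols, the maximum likelihood estimate is just a ratio of corpus counts (a textbook formulation, not copied from the slides):

```latex
P(w_i \mid w_{i-n+1}, \ldots, w_{i-1})
  = \frac{C(w_{i-n+1} \ldots w_{i-1}\, w_i)}{C(w_{i-n+1} \ldots w_{i-1})}
```

where C(·) counts how often the word sequence occurs in the training corpus; the Markov assumption is what licenses truncating the history to the last n−1 words.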
Harald Sack:
This week in our #ISE2025 lecture, we discussed the central question "Can we 'predict' a word?" as the basis for statistical language models. Of course, I was trying Shakespeare quotes to motivate the (international) students to complete the quotes with the "predicted" missing words ;-)

"All the world's a stage, and all the men and women merely..."

#nlp #llms #languagemodel #Shakespeare #AIart #lecture @fiz_karlsruhe @fizise @tabea @enorouzi @sourisnumerique #brushUpYourShakespeare
Harald Sack:
Last week, our students learned how to conduct a proper evaluation of an NLP experiment. To this end, we introduced a small text corpus with sentences about Joseph Fourier, who is considered one of the discoverers of the greenhouse effect responsible for global warming.

https://github.com/ISE-FIZKarlsruhe/ISE-teaching/blob/b72690d38911b37748082256b61f96cf86171ace/materials/dataset/fouriercorpus.txt

#ise2025 #nlp #lecture #climatechange #globalwarming #historyofscience #climate @fiz_karlsruhe @fizise @tabea @enorouzi @sourisnumerique
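Such an evaluation usually boils down to precision, recall, and F1 over the system's decisions. A minimal sketch; the counts in the example call are invented:

```python
def precision_recall_f1(tp: int, fp: int, fn: int) -> tuple[float, float, float]:
    """Standard evaluation triple from raw true/false positive/negative counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0  # how many hits were correct
    recall = tp / (tp + fn) if tp + fn else 0.0     # how many targets were found
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# E.g. a system that found 8 of 10 true mentions, plus 2 spurious ones:
print(precision_recall_f1(tp=8, fp=2, fn=2))  # (0.8, 0.8, 0.8)
```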
Harald Sack:
The last leg of our brief history of NLP (so far) is the advent of large language models with GPT-3 in 2020 and the introduction of learning from the prompt (aka few-shot learning).

T. B. Brown et al. (2020). Language Models are Few-Shot Learners. NeurIPS'20.
https://proceedings.neurips.cc/paper/2020/file/1457c0d6bfcb4967418bfb8ac142f64a-Paper.pdf

#llms #gpt #AI #nlp #historyofscience @fiz_karlsruhe @fizise @tabea @enorouzi @sourisnumerique #ise2025
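"Learning from the prompt" in one picture, using the translation example from the GPT-3 paper; no parameters are updated, the task specification lives entirely in the input text:

```python
# Few-shot prompting: the demonstrations condition the model at inference time.
prompt = """Translate English to French:
sea otter => loutre de mer
peppermint => menthe poivrée
cheese =>"""
# A GPT-3-style model is expected to continue the pattern with "fromage".
```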
Harald Sack:
Next stop in our NLP timeline is 2013, the introduction of low-dimensional dense word vectors, so-called "word embeddings", based on distributional semantics, e.g. word2vec by Mikolov et al. from Google, which enabled representation learning on text.

T. Mikolov et al. (2013). Efficient Estimation of Word Representations in Vector Space.
https://arxiv.org/abs/1301.3781

#NLP #AI #wordembeddings #word2vec #ise2025 #historyofscience @fiz_karlsruhe @fizise @tabea @sourisnumerique @enorouzi
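A minimal gensim sketch of training such embeddings; the toy corpus is invented, and meaningful vectors need corpora with millions of tokens:

```python
from gensim.models import Word2Vec

# Toy corpus: one tokenised sentence per list.
sentences = [
    ["king", "rules", "the", "kingdom"],
    ["queen", "rules", "the", "kingdom"],
    ["peasants", "work", "in", "the", "fields"],
]

# Skip-gram (sg=1), as in Mikolov et al.; 50-dimensional dense vectors.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)
print(model.wv.most_similar("king", topn=3))  # nearest words in embedding space
```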
Harald Sack:
From the 1990s onwards, statistical n-gram language models, trained on vast text collections, became the backbone of NLP research. They fueled advancements in nearly all NLP techniques of the era, laying the groundwork for today's AI.

F. Jelinek (1997). Statistical Methods for Speech Recognition. MIT Press, Cambridge, MA.

#NLP #LanguageModels #HistoryOfAI #TextProcessing #AI #historyofscience #ISE2025 @fizise @fiz_karlsruhe @tabea @enorouzi @sourisnumerique
Harald Sack:
Next stop on our NLP timeline (as part of the #ISE2025 lecture) was Terry Winograd's SHRDLU, an early natural language understanding system developed in 1968-70 that could manipulate blocks in a virtual world.

Winograd, T. Procedures as a Representation for Data in a Computer Program for Understanding Natural Language. MIT AI Technical Report 235.
http://dspace.mit.edu/bitstream/handle/1721.1/7095/AITR-235.pdf

#nlp #lecture #historyofscience @fiz_karlsruhe @fizise @tabea @sourisnumerique @enorouzi #AI
Harald Sack:
With the advent of ELIZA, Joseph Weizenbaum's first psychotherapist chatbot, NLP took another major step: pattern-based substitution algorithms built on simple regular expressions.

Weizenbaum, Joseph (1966). ELIZA—a computer program for the study of natural language communication between man and machine. Communications of the ACM 9: 36–45.
https://dl.acm.org/doi/pdf/10.1145/365153.365168

#nlp #lecture #chatbot #llm #ise2025 #historyofScience #AI @fizise @fiz_karlsruhe @tabea @enorouzi @sourisnumerique
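The flavour of those pattern-based substitutions is easy to reproduce in a few lines; the rules below are simplified illustrations, not Weizenbaum's original script (which, among other things, also swapped pronouns such as "my" to "your"):

```python
import re

# ELIZA-style (pattern, response-template) pairs; \1 echoes the captured phrase.
RULES = [
    (re.compile(r"i need (.*)", re.IGNORECASE), r"Why do you need \1?"),
    (re.compile(r"i am (.*)", re.IGNORECASE), r"How long have you been \1?"),
    (re.compile(r".*\bmother\b.*", re.IGNORECASE), "Tell me more about your family."),
]

def respond(utterance: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return match.expand(template)
    return "Please, go on."  # default when no pattern matches

print(respond("I am sad"))  # -> How long have you been sad?
```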
Harald Sack:
Next stop in our NLP timeline: the (mostly) futile attempts at machine translation during the Cold War era. The rule-based machine translation approach relied mostly on hand-crafted dictionaries and grammar programs. Its major drawback was that absolutely everything had to be made explicit.

#nlp #historyofscience #ise2025 #lecture #machinetranslation #coldwar #AI #historyofAI @tabea @enorouzi @sourisnumerique @fiz_karlsruhe @fizise
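The "everything must be explicit" problem is easy to demonstrate with a word-for-word dictionary translator; the toy lexicon below is invented for illustration:

```python
# Rule-based MT in its most naive form: a hand-written lexicon, no grammar.
LEXICON = {"the": "der", "spirit": "Geist", "is": "ist", "willing": "willig"}

def translate(sentence: str) -> str:
    # Anything not explicitly listed (morphology, word order, ambiguity) fails.
    return " ".join(LEXICON.get(w, f"<{w}?>") for w in sentence.lower().split())

print(translate("The spirit is willing"))  # -> der Geist ist willig
print(translate("The flesh is weak"))      # -> der <flesh?> ist <weak?>
```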
Harald Sack:
Next step in our NLP timeline is Claude Elwood Shannon, who laid the foundations for statistical language modeling by recognising the relevance of n-grams for modelling properties of language and predicting the likelihood of word sequences.

C. E. Shannon (1948). "A Mathematical Theory of Communication."
https://web.archive.org/web/19980715013250/http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf

#ise2025 #nlp #lecture #languagemodel #informationtheory #historyofscience @enorouzi @tabea @sourisnumerique @fiz_karlsruhe @fizise
Harald Sack:
We are starting #ISE2025 lecture 02 with a (very) brief history of #NLP, pointing out only some selected highlights. The linguist Ferdinand de Saussure laid the foundations of today's NLP by describing languages as "systems." He argued that meaning is created inside language, in the relations and differences between its parts.

Course in General Linguistics: https://ia600204.us.archive.org/0/items/SaussureFerdinandDeCourseInGeneralLinguistics1959/Saussure_Ferdinand_de_Course_in_General_Linguistics_1959.pdf

#linguistics #historyofscience @fiz_karlsruhe @fizise @enorouzi @tabea @sourisnumerique @KIT_Karlsruhe #AIFB