Posts

My post on Raygun

 #Raygun, thank you for failing. It is OK to fail. Fail, then move on. Our own President Biden failed recently. Shane Gillis failed on SNL. George Santos failed. Ingrid Andress failed on the national anthem. Doing well is better, but it is OK to fail. Who else?

Random Posts - ML Learning

 More random posts on ML:

https://github.com/chetan51/linguist?tab=readme-ov-file
https://github.com/chetan51/linguist
https://github.com/as3445/NuPIC-Stock-Prediction
https://github.com/ywcui1990/TrafficPrediction
https://github.com/youngminpark2559/connections_between_HTM_video_and_NuPIC_code

====

From https://www.cell.com/neuron/pdf/S0896-6273(17)30509-3.pdf:

Reinforcement Learning. Alongside its important role in the development of deep learning, neuroscience was also instrumental in erecting a second pillar of contemporary AI, stimulating the emergence of the field of reinforcement learning (RL). RL methods address the problem of how to maximize future reward by mapping states in the environment to actions and are among the most widely used tools in AI research (Sutton and Barto, 1998). Although it is not widely appreciated among AI researchers, RL methods were
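The quoted passage describes RL as learning a mapping from states to actions that maximizes future reward. As a concrete illustration, here is a minimal tabular Q-learning sketch in Java on a made-up 5-state chain environment; the environment, reward structure, and hyperparameters are assumptions for illustration only and are not taken from the paper or from the linked repositories.

import java.util.Random;

/**
 * Minimal tabular Q-learning sketch on a hypothetical 5-state "chain" environment.
 * Assumptions (not from the quoted paper): moving RIGHT from the last state yields
 * reward 1.0, every other step yields 0. The agent learns a state -> action mapping
 * that maximizes expected future reward, which is the core RL problem described above.
 */
public class QLearningChain {

    static final int NUM_STATES = 5;
    static final int NUM_ACTIONS = 2;       // 0 = LEFT, 1 = RIGHT
    static final double ALPHA = 0.1;        // learning rate (assumed)
    static final double GAMMA = 0.9;        // discount factor (assumed)
    static final double EPSILON = 0.1;      // exploration rate (assumed)

    public static void main(String[] args) {
        double[][] q = new double[NUM_STATES][NUM_ACTIONS];
        Random rng = new Random(42);

        for (int episode = 0; episode < 1000; episode++) {
            int state = 0;
            for (int step = 0; step < 50; step++) {
                // epsilon-greedy action selection
                int action = rng.nextDouble() < EPSILON
                        ? rng.nextInt(NUM_ACTIONS)
                        : greedyAction(q, state);

                // environment transition: LEFT moves back, RIGHT moves forward
                int next = (action == 1) ? Math.min(state + 1, NUM_STATES - 1)
                                         : Math.max(state - 1, 0);
                double reward = (state == NUM_STATES - 1 && action == 1) ? 1.0 : 0.0;

                // Q-learning update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
                double target = reward + GAMMA * q[next][greedyAction(q, next)];
                q[state][action] += ALPHA * (target - q[state][action]);

                state = next;
            }
        }

        // print the learned greedy policy: expect action 1 (RIGHT) in every state
        for (int s = 0; s < NUM_STATES; s++) {
            System.out.println("state " + s + " -> action " + greedyAction(q, s));
        }
    }

    static int greedyAction(double[][] q, int state) {
        return q[state][1] > q[state][0] ? 1 : 0;
    }
}

After training, the printed greedy policy should choose RIGHT in every state, since walking right is the only way to reach the rewarding transition.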

Ok posting again - more business things

 Ok posting again - more business things

Building the OpenJDK on Linux x86 with logs

Some have had difficulty building the OpenJDK. The OpenJDK build system is designed for different operating system platforms, architectures, and host configurations, so some steps aren't entirely intuitive. Here is a full listing of the commands I used to build on x86 Linux (Ubuntu 10.04, as of June 2011).

Install the build dependencies:

sudo apt-get install gawk alsa alsa-base alsa-utils alsa-tools libasound2-dev
sudo apt-get install libx11-dev libxt-dev x11proto-xext-dev libxext-dev x11proto-input-dev libxi-dev
sudo apt-get install libxtst-dev libmotif-dev xutils-dev libfreetype6-dev
sudo apt-get install libcups2-dev ant g++ libxrender-dev libfreetype6 libfreetype6-dev freetype2-demos
sudo apt-get install x11proto-print-dev

Run these at the command line:

export LANG=C
export ALT_BOOTDIR=/usr/lib/jvm/java-6-openjdk
export ALT_JDK_IMPORT_PATH=$ALT_BOOTDIR
export ALLOW_DOWNLOADS=true
make sanity
make

Using OpenJDK version: openjdk-6-src-b22-28_feb_2011.tar.gz

Additional Comman

Simple OpenJDK compiler build

If you are interested in a simple build of the OpenJDK compiler, see this mirror project:

http://jvmnotebook.googlecode.com/svn/trunk/javac_compiler/JavaSource

To build, type:

mvn package

To run the compiler:

mvn exec:java -e -Dexec.mainClass="berlin.com.sun.tools.javac.MainJavac" -Dexec.args="Test.java"
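The Test.java passed via -Dexec.args can be any small source file; a hypothetical example (an assumption for illustration, not part of the mirror project) would be:

// Test.java - hypothetical input file for the compiler run above
public class Test {
    public static void main(String[] args) {
        System.out.println("Compiled by the mirrored javac");
    }
}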

Basic Math : Basic Summation

Given the basic expression for summation:

\[ \begin{align} \sum_{i=1}^n i \end{align} \]

Calculating the scalar sum: 1 + 2 + 3 + 4 ...

public class ScalarSum {

    // sum f(index) for index = j..n
    public int sum(final f f, int j, int n) {
        int sum = 0;
        for (int index = j; index <= n; index++) {
            sum = sum + f.$(index);
        }
        return sum;
    }

    public interface f {
        public int $(final int x);
    }

    public static class Fx implements f {
        public int $(int x) { return x; }
    }

    public static class Fx2 implements f {
        public int $(int x) { return x * x; }
    }

    public static class Fx3 implements f {
        public int $(int x) { return x * x * x; }
    }
}

System.out.println("Sum: " + new ScalarSum().sum(new ScalarSum.Fx(), 1, 100));

With Haskell:

-- accumulate x, x+1, ..., y into the running sum
summation1 :: Integer -> Integer -> Integer
summation1 x y = summation' x y 0

summation' :: Integer -> Integer -> Integer -> Integer
summation' x y sum = if y < x
                     then sum
                     else summation' (x + 1) y (sum + x)
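As a quick check on the example call above (summing i from 1 to 100), the closed form for this sum gives the same value the program should print:

\[ \begin{align} \sum_{i=1}^{100} i = \frac{100 \cdot 101}{2} = 5050 \end{align} \]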

MathJax Test - Looks like it is working

\[
\begin{align}
\nabla \times \vec{\mathbf{B}} -\, \frac1c\, \frac{\partial\vec{\mathbf{E}}}{\partial t} & = \frac{4\pi}{c}\vec{\mathbf{j}} \\
\nabla \cdot \vec{\mathbf{E}} & = 4 \pi \rho \\
\nabla \times \vec{\mathbf{E}}\, +\, \frac1c\, \frac{\partial\vec{\mathbf{B}}}{\partial t} & = \vec{\mathbf{0}} \\
\nabla \cdot \vec{\mathbf{B}} & = 0
\end{align}
\]

\[ \begin{align} \sum_{k=0}^n a \end{align} \]

\[ \displaystyle \sum_{j=1}^n a_j = \left({a_1 + a_2 + \cdots + a_n}\right) \]

\[ \displaystyle \sum_{k=m}^n f_k \left({g_{k+1} - g_k}\right) = \left({f_{n+1} g_{n+1} - f_m g_m}\right) - \sum_{k=m}^n \left({f_{k+1} - f_k}\right) g_{k+1} \]