softmax and log

A couple of simple ones. Softmax and log.

Here is the Python:
import copy
import math

  def softmax(self):
    # exponentiate each coefficient, then normalize so the coefficients sum to 1
    result = copy.deepcopy(self)
    the_sum = sum(math.exp(x.value) for x in result.data)
    if the_sum > 0:              # an empty superposition has sum 0, so leave it unchanged
      for x in result.data:
        x.value = math.exp(x.value)/the_sum
    return result

def log(x,t=None):
  if x <= 0:
    return 0                   # map coefficients <= 0 to 0 rather than raise a domain error
  if t is None:
    return math.log(x)         # default is base e, ie natural logarithm
  return math.log(x,t)         # choose another base
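
To see softmax run outside the project, here is a minimal, self-contained sketch. The Ket and Superposition classes below are hypothetical stand-ins, not the project's real classes:

import copy
import math

class Ket:
  # hypothetical stand-in: a labelled coefficient
  def __init__(self, label, value=1):
    self.label = label
    self.value = value

class Superposition:
  # hypothetical stand-in: a list of kets, carrying the softmax method from above
  def __init__(self, *kets):
    self.data = list(kets)

  def softmax(self):
    result = copy.deepcopy(self)
    the_sum = sum(math.exp(x.value) for x in result.data)
    if the_sum > 0:
      for x in result.data:
        x.value = math.exp(x.value)/the_sum
    return result

  def __str__(self):
    return " + ".join("%.3f|%s>" % (x.value, x.label) for x in self.data)

print(Superposition(Ket("x", 20), Ket("y", 25)).softmax())
# prints: 0.007|x> + 0.993|y>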
Now, putting them to use:
sa: softmax 10|x>
|x>

sa: softmax (10|x> + 5|y> + 30|z> + |u> + 0.5|v>)
0.0|x> + 0.0|y> + 1.0|z> + 0.0|u> + 0.0|v>

sa: softmax (20|x> + 25|y>)
0.007|x> + 0.993|y>
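
These follow straight from the definition. A quick check with plain Python (not project code) also shows why, in the earlier example, exp(30) swamps the other terms and everything else rounds to 0:

import math

# the 20|x> + 25|y> example:
s = math.exp(20) + math.exp(25)
print(round(math.exp(20)/s, 3))     # 0.007
print(round(math.exp(25)/s, 3))     # 0.993

# why 30|z> dominates the earlier example:
print(math.exp(30)/math.exp(10))    # about 4.85e8, so z's share rounds to 1.0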

sa: log |x>
0|x>

-- default is base e, so log(e) = 1.
sa: log 2.71828 |x>
1.0|x>

sa: log[2] 128|x>
7|x>

sa: log[10] (5|a> + 10|b> + 100|c> + 375|d> + 123456|e>)
0.699|a> + |b> + 2|c> + 2.574|d> + 5.092|e>
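
The log examples check out the same way with plain Python's math.log (recall from the definition above that coefficients <= 0 would map to 0):

import math

print(round(math.log(2.71828), 3))      # 1.0
print(round(math.log(128, 2), 3))       # 7.0
print(round(math.log(123456, 10), 3))   # 5.092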


updated: 19/12/2016
by Garry Morrison
email: garry -at- semantic-db.org