Suppose $ g $ is a function with the following properties:

$ g(x+y) = \frac{g(x)+g(y)}{1-g(x)g(y)} $

$ \lim_{h \to 0} g(h) = 0 $

$ \lim_{h \to 0} \frac{g(h)}{h}= 1 $

a. Show that $ g(0) = 0 $.

b. Show that $ g'(x) = 1 + [g(x)]^2 $.

c. Find $ g(x) $ by solving the differential equation in part (b).


Anyone know where to start? I'm defeated at every turn; I can't break the function into even/odd portions that are of any use, and none of the laws of exponentials/logarithms seem to help. The only fact I can pull out is that $ g'(0)=1 $, which follows directly from the third given limit once $ g(0)=0 $ is known (no L'Hôpital needed).
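
Concretely (a sketch, assuming part a's result $ g(0)=0 $), the definition of the derivative gives

$ g'(0) = \lim_{h \to 0} \frac{g(0+h)-g(0)}{h} = \lim_{h \to 0} \frac{g(h)}{h} = 1. $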

--Jmason 15:28, 5 October 2008 (UTC)


  • You can show g(0) = 0 by solving the functional equation for g(x) (Yes, you can do it. No, it's not that hard), and then plugging 0 in for x (a worked sketch follows this list). As for the other parts, I haven't got that far yet. I'll see what I get. And wow, I've been working on this problem for half an hour already, I think. Jhunsber
  • Part b is a lot trickier. Remember that $ g'(x)=\lim_{h\to 0}\frac{g(x+h)-g(x)}{h} $. If you solved for g(x) in part a, you can plug that in, substituting h for y, since if h = y then g(x+y) = g(x+h). From there, just simplify and factor until you get the answer you need (see the sketches below). Jhunsber
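
A sketch of the first hint, for part a: solving the functional equation for $ g(x) $ gives

$ g(x) = \frac{g(x+y)-g(y)}{1+g(x+y)\,g(y)}, $

and plugging in $ x = 0 $ (so that $ g(x+y) = g(y) $) collapses the numerator:

$ g(0) = \frac{g(y)-g(y)}{1+[g(y)]^2} = 0. $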
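And a sketch of the part b hint: substituting $ y = h $ into the functional equation and subtracting $ g(x) $,

$ g(x+h)-g(x) = \frac{g(x)+g(h)}{1-g(x)g(h)} - g(x) = \frac{g(h)\left(1+[g(x)]^2\right)}{1-g(x)g(h)}, $

so the difference quotient factors, and the two given limits finish it:

$ g'(x) = \lim_{h\to 0}\frac{g(h)}{h}\cdot\frac{1+[g(x)]^2}{1-g(x)g(h)} = 1\cdot\frac{1+[g(x)]^2}{1-0} = 1+[g(x)]^2. $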
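Part c hasn't come up yet, but for completeness, a sketch: the differential equation from part b is separable,

$ \int \frac{dg}{1+g^2} = \int dx \quad\Longrightarrow\quad \arctan(g(x)) = x + C, $

and the initial condition $ g(0) = 0 $ forces $ C = 0 $, so $ g(x) = \tan(x) $. As a sanity check, the original functional equation is exactly the tangent addition formula $ \tan(x+y) = \frac{\tan x + \tan y}{1-\tan x \tan y} $.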
