Can a loss be negative? #1
Hi. Well, the code used to work on PyTorch 0.4; I have not tested it on 1.1 yet, so some APIs may have changed. And what exactly do you mean by the writing not being good? Is it very small, not on a line, or do some strokes run very long? However, there is one thing I would like you to modify and check. Change this line: Check if that works well while generating sequences.
Hi. I made the change with self.training, but the result looks the same as before. Thanks for the code. This is my modified train.py:
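For reference, `self.training` is the standard PyTorch flag for switching a module between train and eval behaviour, which matters when generating sequences. Here is a minimal plain-Python sketch of that pattern (a hypothetical dropout-style module for illustration, not this repo's actual code):

```python
import random

class Module:
    """Sketch of torch.nn.Module's train/eval switching (hypothetical)."""
    def __init__(self):
        self.training = True  # mirrors torch.nn.Module.training

    def train(self):
        self.training = True
        return self

    def eval(self):
        self.training = False
        return self

class Dropout(Module):
    def __init__(self, p=0.5):
        super().__init__()
        self.p = p

    def forward(self, xs):
        # Branch on self.training instead of a hard-coded flag, so that
        # calling model.eval() automatically disables stochastic behaviour
        # at generation time.
        if self.training:
            return [0.0 if random.random() < self.p else x / (1 - self.p)
                    for x in xs]
        return list(xs)  # identity in eval mode

drop = Dropout(p=0.5)
drop.eval()
print(drop.forward([1.0, 2.0, 3.0]))  # eval mode: passes through unchanged
```

The point of the suggested change is that anything which should behave differently during sampling (dropout, noise injection, teacher forcing) should key off `self.training` rather than a constant.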
Aah, so by gibberish you mean there are no words? That is expected, because the current version is unconditional: it just randomly generates strokes that look like handwriting. I have not gotten around to implementing the conditional version, where you can give a word as input and the model will write that word in the generated handwriting.
Hi. Sorry, I didn't explain myself well. I mean that what it generates can't be considered handwriting, just random lines. I know it's unconditional (actually what I wanted).
Cheers.
…On Sat, May 4, 2019 at 3:58 AM Nabarun Goswami wrote:
Sorry, I also don't understand why the loss would be negative.
Hi.
Yesterday I adapted the net to run with PyTorch 1.1.
After 40 epochs its writing was not good at all, and the errors are not improving.
But what confuses me most is that the loss is often negative. Can a loss be negative?
Thanks.
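As an aside, a negative loss is not necessarily a bug here. If the model is trained by minimizing the negative log-likelihood of continuous pen offsets (as in Graves-style handwriting models with a mixture-density output; I am assuming that is what this repo does), the loss involves a log of a probability *density*, and a density can exceed 1. A minimal plain-Python illustration with a single Gaussian:

```python
import math

def gaussian_nll(x, mu, sigma):
    """Negative log-likelihood of x under a Gaussian N(mu, sigma^2)."""
    return 0.5 * math.log(2 * math.pi * sigma ** 2) + (x - mu) ** 2 / (2 * sigma ** 2)

# With a broad density the NLL is positive...
print(gaussian_nll(0.0, 0.0, 1.0))   # ~0.919
# ...but when the model fits the data tightly (small sigma), the density
# at the data point exceeds 1, so its log is positive and the NLL is
# negative -- with no bug involved.
print(gaussian_nll(0.0, 0.0, 0.1))   # ~-1.384
```

So for a density-based loss, "more negative" can simply mean "fitting more sharply"; a negative value on its own does not indicate the training is broken.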