GlaDOS

English Β· RVC V2 Β· TTS / Realtime Β· Fictional

πŸ‘€ 502 Β· πŸ‘ 19 Β· πŸͺ„ 426

Description

This is a long overdue post, but anyway: funny robot lady.

Comments

I'm happy that I beat Portal 1.

Both Portal games are so good, man.

Do yourself a favour and play both.

I have both anyway, and I'm at the part in Portal 2 where Wheatley is in control.

"I am a potato"

*Clap* *Clap* Thank God that's still working.

GLaDOS sounds like she's trapped me in a room and is trying to give me a lesson on lesbians

This is what it feels like to be a personality core or a turret in Aperture

This one's the best one; a simple robo-effect filter would make it way better, fr.

She has it if you speak like her in a monotone voice; like, the "s" and sometimes the "f" have a robotic sound to them.

Yeah, but not everyone is going to speak like that.

Or well, I guess, if you OVERTRAIN it

to, like, FORCE it to match the dataset.

True, I do that when using the model in real time, just to make it actually sound like her.

I could never

Overtraining can be both good and bad.

Sometimes it's good,

I guess depending on what your model is supposed to sound like

Even if you want it to sound human, from my tomfoolery the overtrained models sometimes sound better, tbh. But then it might as well NOT be overtrained, because I'm just basing this on the two total-loss graphs, even though that's NOT how you tell if it's overtraining or not.

There's grad norm, and then loss_fm and loss_mel, and so much other stuff.

So I just never bother with the graphs; I tell by just testing the checkpoints and picking the best one.

That's the best way to check anyway

It's the easiest, at least.
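If you ever do want a quick look at those curves without a dashboard, you can read them straight out of the TensorBoard event files the training run writes. A minimal sketch in Python; the log directory (logs/glados) and the scalar tag (loss/g/total) are assumptions here, so check ea.Tags() to see what your run actually recorded.

```python
# Minimal sketch: plot one training curve from TensorBoard event files.
# Both the log directory and the scalar tag below are assumptions --
# print(ea.Tags()) to see what your own run logged.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator
import matplotlib.pyplot as plt

LOG_DIR = "logs/glados"     # hypothetical path to the run's event files
TAG = "loss/g/total"        # hypothetical generator total-loss tag

ea = EventAccumulator(LOG_DIR)
ea.Reload()                               # read every event file in the folder

events = ea.Scalars(TAG)                  # records with .step and .value
steps = [e.step for e in events]
values = [e.value for e in events]

plt.plot(steps, values)
plt.xlabel("step")
plt.ylabel(TAG)
plt.title("Total loss over training")
plt.show()
```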

Set up W&B for RVC 😭 /j don't even bother

Weights and Biases

It's the holy grail of graphs.

It's a mess to set up, but it's great.

literally try looking at a guide for this shit

"basic" set up <

Also, yeah, it's useless to set this up for RVC; RVC is NOT deserving of this and it's not worth it.
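For what it's worth, the "mess" part of W&B is mostly the account and API-key setup; the logging calls themselves are tiny. A minimal sketch of what it would look like wired into a training loop, with a hypothetical project name and placeholder metric values:

```python
# Minimal sketch: log training metrics to Weights & Biases.
# The project name, config values, and all metric numbers are placeholders;
# in a real run they would come from your training step.
import wandb

run = wandb.init(project="rvc-glados", config={"batch_size": 8, "epochs": 100})

for epoch in range(1, run.config["epochs"] + 1):
    metrics = {
        "loss/g/total": 0.0,   # generator total loss (placeholder)
        "loss/d/total": 0.0,   # discriminator total loss (placeholder)
        "grad_norm": 0.0,      # gradient norm (placeholder)
    }
    wandb.log(metrics, step=epoch)

run.finish()
```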

I honestly don't know too much about RVC or training AI models in general.

Same. You just do this with goofy parameters and go brrt.

Batch 4 for a 10-minute dataset and 8 for a 20-minute one. We don't talk about the code-level tweaks, but :trolley: <

Fr, I always use batch 8 for everything because idk how it works.

A high batch size learns fast and a low batch size learns slower, so the low one is better for smaller datasets :trolley:

Don't do batch size 1 :gru:

Batch size 1 sounds like hell

How long for just one singular epoch?

I had batch size 8 on a 9-second dataset of a LEGO Tusken Raider :boohooh:

I probably should redo it with size 4 or something

Not only slow, but it will make your model shit-tier.

I'd do 3, tbh.
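Putting the thread's rule of thumb in one place: roughly batch 4 for ~10 minutes of audio, 8 for ~20, about 3 for a very short clip, and never 1. A tiny sketch of that heuristic; only those anchor points come from the thread, the cutoffs in between are guesses.

```python
def suggest_batch_size(dataset_minutes: float) -> int:
    """Batch-size rule of thumb from the thread above.

    The anchor points (tiny clip -> 3, ~10 min -> 4, ~20 min -> 8) come
    from the comments; the exact cutoffs between them are guesses, not
    anything RVC itself enforces.
    """
    if dataset_minutes < 2:
        return 3   # very short clip, e.g. a few seconds of audio
    if dataset_minutes < 15:
        return 4   # roughly a 10-minute dataset
    return 8       # roughly 20 minutes or more

print(suggest_batch_size(9 / 60))  # 9-second dataset -> 3
print(suggest_batch_size(10))      # -> 4
print(suggest_batch_size(20))      # -> 8
```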

Pretty decent

Samples

1. Singing Β· Male Β· English
2. Singing Β· Female Β· English
3. Singing (Dry) Β· Female Β· English
4. Singing (High) Β· Female Β· English
5. Singing 2 Β· Male Β· English
6. Singing (Dry) Β· Male Β· English
7. Singing (Dry, High) Β· Male Β· English

Weekly Metrics

Collections with this model

Favorites Β· 10 models Β· Enzo Vieira
GlaDOS Β· 1 model Β· Brandy Roller
Models Β· 2.1k models Β· serf
The O5's Β· 15 models Β· Anthony Kenyon