1. #1
    Deleted

    Streaming - Upgrading my CPU

    Hey everyone, I started setting up my stream again yesterday and it seems my current setup isn't handling it all too well. I'm constantly at max CPU load and the stream is hanging every few seconds, so I'm considering upgrading my CPU.

    If you take a look at this video for more than 20 seconds you'll see what I mean: Example

    I will mainly be streaming World of Warcraft.

    My current setup:
    • Asus P8P67 PRO
    • Intel i5 2500K
    • Nvidia GeForce GTX 560 Ti Twin Frozr II (2GB)
    • Corsair Vengeance Dual Channel DDR3 8GB

    I was thinking of getting an Intel i7-4770K since it's the same socket, I could keep my motherboard and just replace the CPU. (I have an H80 Cooler btw).
    Will this upgrade make it possible to stream at 1080p without maxing out the CPU?

    Thanks in advance!
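    As a rough sanity check (my own back-of-the-envelope math, not something from this thread), the video bitrate needed for a given resolution can be estimated with a bits-per-pixel heuristic; the 0.1 BPP figure below is a common rule of thumb for H.264, not a measured value:

    ```python
    # Rough target video bitrate using a bits-per-pixel (BPP) heuristic.
    # The 0.1 BPP default is a ballpark figure for H.264, not an exact spec.
    def estimated_bitrate_kbps(width, height, fps, bpp=0.1):
        """Return an approximate target video bitrate in kbit/s."""
        return width * height * fps * bpp / 1000

    # 1080p at 30 fps comes out to roughly 6200 kbit/s of video.
    print(round(estimated_bitrate_kbps(1920, 1080, 30)))
    ```

    That estimate is for the video alone; actual upload headroom also has to cover audio and protocol overhead, which is why upload speed matters as much as CPU here.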

  2. #2
    Do you have an OC on that chip? You mentioned a good cooler, so you could OC if you have not already and see if that improves it enough for you. If not, well, you were thinking of upgrading anyway.

    Also, what are your internet speeds?

  3. #3
    Remilia
    It's not the same socket. Sandy Bridge is LGA 1155; Haswell is LGA 1150. If you are going for a Haswell i7, get the 4790K instead.
    Did you OC the CPU before you considered this?

  4. #4
    Deleted
    Hey, thanks for the replies. Yes, I OC'd it to 4.5 GHz for my test stream. Good that you mention the difference in socket! I totally missed that last 5 when looking through the info.

    EDIT: @Larthais: My internet speed is ok I think (around 5 Mb/s upload). I had 12 dropped frames over a 3 hour stream, so that's not the issue afaik.
    Last edited by mmoccd637925f1; 2015-02-16 at 10:23 PM.
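    For context (my arithmetic, not from the thread; the 30 fps figure is an assumption), 12 dropped frames over a 3-hour stream really is a vanishingly small fraction:

    ```python
    # Fraction of frames dropped over a 3-hour stream.
    # The dropped-frame count comes from the post above; 30 fps is assumed.
    dropped = 12
    total_frames = 3 * 3600 * 30  # 3 hours x 3600 s/h x 30 frames/s = 324000
    drop_pct = 100 * dropped / total_frames
    print(f"{drop_pct:.4f}% of frames dropped")
    ```

    Dropped frames point at network trouble, while the stuttering described in the opening post points at encoding load, so the two symptoms really are separate issues.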

  5. #5
    Eh, you don't need an i7 for streaming; I used to stream all the time on Twitch with the exact same CPU as you. Use OBS with game capture, and don't mess with any of the CPU settings, they are fine out of the box; just tweak your bitrate etc. My CPU usage barely went up when I streamed. You are either not using game capture or something else is eating up your CPU.

  6. #6
    Deleted
    @Fascinate I am using game capture. WoW takes up 20-30% and OBS takes up all that's left. I don't get how your CPU load barely went up, I mean encoding is a heavy task...

  7. #7
    You are doing something wrong. I couldn't even TELL I was streaming, that's how little my CPU usage went up. Go back into OBS, reset everything to default, and just set your bitrate/fps etc.; don't touch any of the encoding stuff. Trust me man, your CPU is more than enough.

  8. #8
    bagosham
    OBS has an option to load your GPU instead of the CPU for x264 video compression. I've got an old PC (Core 2 Quad 2.4 GHz + Nvidia GTX 660) and it can easily stream any game at 1920×1080, 30 fps.
    PS: http://i.imgur.com/0xuAoVA.png
    Last edited by bagosham; 2015-02-16 at 11:41 PM.

  9. #9
    He can't, actually; P67 boards don't have onboard video. But he really doesn't need to, a 2500K is more than sufficient, they just need to tweak some settings or their PC isn't running properly in some way or another.

  10. #10
    bagosham
    Fascinate, he has an Nvidia GTX 650 and OBS will be able to use it.

  11. #11
    Oh, I read that wrong, thought you were talking about Quick Sync (which is another option OBS uses to reduce CPU load).

  12. #12
    Quote Originally Posted by bagosham View Post
    OBS has an option to load your GPU instead of the CPU for x264 video compression.
    I've no experience with OBS, but are you really sure it's x264 on the GPU? As far as I know there's no hardware support for x264 on GPUs, making them rely on much lossier alternatives.

  13. #13
    Remilia
    Quote Originally Posted by Raphtheone View Post
    I've no experience with OBS, but are you really sure it's x264 on the GPU? As far as I know there's no hardware support for x264 on GPUs, making them rely on much lossier alternatives.
    http://www.anandtech.com/show/7492/t...80-ti-review/3
    It's the H.264 codec, though Nvidia's implementation may not be great; not really sure, never bothered with it for streaming. x264 and H.264 are essentially the same in the sense that one implements the other. It's the implementation that matters.
    Last edited by Remilia; 2015-02-18 at 10:08 PM.

  14. #14
    Quote Originally Posted by Remilia View Post
    http://www.anandtech.com/show/7492/t...80-ti-review/3
    It's the H.264 codec, though Nvidia's implementation may not be great; not really sure, never bothered with it for streaming. x264 and H.264 are essentially the same in the sense that one implements the other. It's the implementation that matters.
    Yeah I suppose.

    It's a bit of cheating though, since GPU and CPU encoding are completely different in nature (parallel vs serial). The only way I know of to allow "x/h264" video encoding on the GPU involves heavy compromises in quality (regardless of codec). Personally, I wouldn't call that x264. No, scratch that, x264 encoding is x264 encoding, regardless of implementation. I wouldn't consider it as an alternative, given that I, like many others, look to x264 for quality first and file size as a really nice bonus, not the other way around. Which is such a shame, since the (near lack of) frame loss with Shadowplay is really awesome.

    I am curious, though, why they're capping the capture at low quality settings and offering monstrous Mbps settings to (somewhat) compensate. I can't wrap my head around it, and can't find sources that explain it.
