HEVC bitrate for 1080p (Reddit)

For screen-recording video you want closer to QP 20. Decoding load depends on the bitrate, but 1080p doesn't need much power if the bitrate is reasonable. Trying to nail down the lowest good-looking bitrate isn't worth the time; for everyday use I just used 1080p 60fps at 25 Mbps. The higher the resolution and bitrate you send to YouTube, the better the image you will get.

As for streaming, it depends on YouTube vs. Twitch; just google their recommended bitrate settings for the resolution you want to hit. Recent digital movies will occupy the low end of that range, old grainy analog sources the high end.

You shouldn't need to buy the HEVC codec from the Microsoft Store, since that one is only for playing videos in the Windows video player. I use my 3080 when I don't feel like firing up my streaming rig.

The difference to me is noticeable, especially for new anime. Usually the issue is the client not being able to direct play.

For 1080p60 at 1:1 canvas:output, 50 Mbps should be plenty. Use CQP instead.

In your case, imo a 2500 kb/s bitrate is a bit low for 1080p, and I think 8000 kb/s would be a very good spot (for 4K I usually aim at 20 Mbps or something close). This is of course if you want good quality; if 8000 kb/s still feels like too much, you can download something lower, since you are the one who's gonna watch the content. 1080p is usually pretty good between 5-8 Mbps, maybe 15 for transparency.

In an ideal world, they shouldn't even re-encode that already small 1080p to an even smaller 1080p (to downscale? sure). Usually there's a 1080p H.264 upload around 1.2 GB, a higher-bitrate 1080p H.264 version that's 3-4 GB, and then 720p and low-bitrate 1080p versions in H.265/x265.

Unless you need to stream or hit a specific size, targeting a bitrate is unnecessary; H.265 can use 25-50% less bitrate than H.264 and be the same or better quality.

HandBrake Video tab: set the Constant Quality slider to 22 and the Video Encoder to H.265.
A 720p encode with a good bitrate and encoding quality will outperform a mediocre 1080p encode. Even upscaling 1080p content to 1440p/4K will provide better quality than plain 1080p uploads. I don't know who would go over, say, 20 Mbps for 1080p HEVC anyway. Competitors to HEVC have found it kind of tough to get as much of a foothold.

I have some footage from trips and wanted to find the best way to encode it to HEVC with speed and quality. AMD is terrible at anything above 1080p, even with HEVC. Just wanted to know what you guys prefer.

Would animated content (TV shows like South Park, Archer, etc.) require a lesser bitrate to achieve a "similar" level of quality compared to live action?

If you must shoot 1080p60 you're gonna want at least 7.5 Mbit/s. Constant-quality encoding is the inverse of average-bitrate (ABR) encoding.

BluRay gets converted to H.265. Out of our 1.85M combined budget we allow 256k for audio (AAC 5.1).

I currently stream to YouTube at 3440x1440, 60fps, with a bitrate of 22000 Kbps, using the Nvidia NVENC HEVC codec (P5 preset).

My rule of thumb for H.264 is 2 Mbps for 720p and 3-5 Mbps for 1080p depending on the source: 3 Mbps for TV series and 5 Mbps for movies.

As 1080p HEVC is rarely released, you mean those x265 re-encodes that use the 3-4 GB ones as source, I guess.

Of course it looks better at my native resolution (sharper), but with increased motion and screen noise there are more compression artifacts due to the lower bitrate.

Use aq-strength 0.6 to 1 (more complex/detailed = higher value needed; don't go beyond 1 as you may introduce ringing artifacts), especially if you're using the 8-bit encoder.

I did find bitrate recommendations (e.g. Netflix recommends 15 Mbps for UHD), so I've been trying to find a CQ value that delivers that bitrate.

I'm planning on getting a 4K TV with OLED and HDR in the near future (and slowly re-downloading my Plex library in 4K) and wondering if I should just leave the h264 files as they are.

Downgrading of resolution: let's say I have a 1080p TV and a 4K file. 2 GB is really pushing it, though.
Right now, when I encode UHD 4K HDR files with HEVC/H.265 on the CPU, I'll usually average around 0.8 FPS. Use VBR with the maximum bitrate set to 3300 kbps and the base something like 2000 or 1500.

It was fixed for me as soon as I switched to 60 Hz, no matter the Vsync setting.

That's called using a lower bitrate / higher CRF / whatever. Values might be different for HEVC and H.264 due to the nature of the codecs. From some googling, it seems that a good bitrate for 720p is 2-4 Mbps and a good bitrate for 1080p is 4-6 Mbps, but I can't find many consistent sources on a good bitrate for 4K.

Main advantage: HEVC is higher-quality and more efficient than AVC.

All unneeded audio tracks are dropped, and kept ones have Passthrough enabled, so they won't get transcoded twice (I can forgive a lesser video experience sooner than a lesser audio experience). Streaming would require 720p/30 due to AMD's encoder quality issues, plus bitrate compensation.

And X video can stream at 1080p with the bitrate at 1100. AV1 excels at high resolution, low bitrate.

You can cut bitrate back by 30-40% with x265 on most content, but much less so on challenging content. For archival videos with a nice high bitrate, either codec is fine. Note: I am generalizing and assuming OP only wants to turn the bitrate knob here and doesn't want to get into the weeds with the many tweaks they could configure within their HEVC compression settings.

Since you stream on YouTube, use HEVC instead. Not all encoders use CRF; in fact, most hardware encoders don't, and opt to implement only simpler rate-control methods like CQP. The end user never gets 1080p.

CRF 23 will yield outstanding compression on HEVC, but without 10-bit you'll get banding problems with an 8-bit encode at that CRF. For me, anything in 1080p I try to keep above 10 GB per movie, which works out to a bitrate of around 10-12k kbps. H.265 files can be 25-50% less bitrate than H.264.
But good to know about stickers and such, I never thought about it! For movies I really love, it's a 1080p or 4K remux for sure. It literally stands for High Efficiency Video Coding.

Really late reply (found via search), but from doing a bunch of tests today in OBS (v30) using a 4070 TS & NVENC AV1, I needed roughly 50% extra bitrate going from 1080p to 1440p to reach the same level of clarity, judged against very specific facial features of a GIF animation I have overlaid in the corner of the video, looping while very high-motion/noise gameplay goes on in the background.

Audio max: 448/640 kbps.

I use HEVC on YouTube with an RX 6800 XT at 1440p/20k kbps bitrate and streams look great.

I've made extensive comparisons with my personal tuning for x265 last year (default tunings for x265/HEVC encoders are not really that good at preserving fine detail) and found that AV1 washes away more fine low-contrast detail than x265 at reasonable bitrate levels (something difficult to quantify, but equivalent to around CRF 22).

I would always encode to 1080p H.264 at a constant bitrate of about 9500 kbps and passthrough the original audio, so an average 2-hour movie output file will be around 8-9 GB.

Lock the file size of your encodes. However, a 4 GB HEVC 4K file is comparable in quality to a 2 GB 1080p HEVC file.

It's when you start cranking up the compression (while keeping resolution) that AV1 really shines. So I'm wondering how I could set this up in Tdarr. In RipBot264 you can make a 1080p Blu-ray variable-bitrate encode that fits on a DVD9 disc.

Hey all. Currently I have a Ryzen 2600 paired up with a GTX 1660 Super.

H265 is not more efficient at high bitrates for a 1080p video; at the same bitrate my x264 encodes come out better than the x265 ones, with much less encoding time. Bitrate decides the quality of a 1080p video.

HandBrake: set the Video Encoder to H.265 10-bit (x265), change the Encoder Options Preset to Medium, then configure the Audio tab. These are the situations where you get the much-hyped 50% reduction in bitrate.
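As a sanity check on the file sizes quoted above, container size is just (video bitrate + audio bitrate) times duration. A minimal sketch; the 9500 kbps video figure comes from the comment above, while the 640 kbps audio track and the helper name are my own assumptions:

```python
def file_size_gb(video_kbps: float, audio_kbps: float, minutes: float) -> float:
    """Rough container size in GB: (video + audio bitrate) * duration / 8 bits."""
    total_bits = (video_kbps + audio_kbps) * 1000 * minutes * 60
    return total_bits / 8 / 1e9

# 2-hour movie at ~9500 kbps video with a 640 kbps audio passthrough
print(round(file_size_gb(9500, 640, 120), 2))  # ~9.13 GB, matching the 8-9 GB estimate
```

The same helper also explains why halving the video bitrate for HEVC roughly halves the file size: audio passthrough is a small, fixed share of the total.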
If you switch to H.265 I'd drop that to around 7000 kbps. To give an example: for live-action 1080p video content I aim for 6 Mbps (a personal decision). If it is H.264, it should be at least 12000.

I have a 4K TV; almost all my content is 1080p. In Plex Media Server Settings -> Remote Access, make sure both "Internet upload speed" and "Limit remote stream bitrate" are configured appropriately.

What about 4K? The answer is "it depends", lol. Video quality mostly depends on the bitrate of the video. As another Redditor says in this thread, there is a YouTube video where someone uses 52 as the quality value.

That sounds too big to me compared to around 180 MB/min at CQP 21 for 1080p/60 recordings, but I need to play around with HEVC and higher-quality CQP to find my sweet spot for 1440p. 1080p is sharper. An H.265 file can look good at sizes as low as 100 MB.

High-motion content (e.g. moving foliage) results in blocky/blurry footage.

Preset: High Quality. Encoding mode / rate control: 2-pass VBR. Target bitrate: 1500k. Maximum bitrate: 1750k. VBV size: 2400k.

I use 10-bit x265 and Opus/AAC 5.1. My internet speed tests at 325 Mbps down / 10 Mbps up.

I'm reading through the FFmpeg guide on encoding AV1, and under the CRF section for SVT-AV1 it states: "For source files of 1080p and 720p AV1 optimization is not optimal; most of the time it is better to go with H265 for a better VMAF/size ratio."

A high-bitrate 1080p will look noticeably better than a mid-bitrate 4K movie/show. And to be clear, a lower bitrate means more compression and a smaller file.

I want to start recording entire playthroughs of games in a single session; however, 140 Mbps only allows roughly 14.5 hours of recording.
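For the disk-space side of that 140 Mbps recording question, the same arithmetic gives hours of footage per drive. A small sketch; the 930 GB of usable space on a nominal 1 TB SSD is my assumption, not a figure from the thread:

```python
def recording_hours(disk_gb: float, bitrate_mbps: float) -> float:
    """Hours of footage that fit on a disk at a constant recording bitrate."""
    total_bits = disk_gb * 1e9 * 8
    return total_bits / (bitrate_mbps * 1e6) / 3600

# ~930 GB usable on a "1 TB" SSD, recording 4K60 at 140 Mbps
print(round(recording_hours(930, 140), 1))  # ~14.8 hours, in line with the ~14.5 quoted
```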
No, I am also using NVENC HEVC at a 22000 bitrate, but quality is blurry in games like high-fps Apex.

This is in advanced settings, but the "Indistinguishable Quality, Large file size" preset in Simple mode uses CQP 16 (just use that). CQP (Constant Quantization Parameter) is a quality target, so it will raise and lower the bitrate as needed to meet that target for each frame.

Now they all ask me to upgrade any content with low bitrate/resolution.

...but it wouldn't be so great for 60fps of an action fight scene.

As u/spong_miester said: that being said, I usually encode my H.264 files... For bitrate I try to get stuff at more or less 6k bitrate or more.

And would it be like 5 Mbps for H.265? With H.265, a 1080p file that's 300 MB can look as amazing as a 1.x GB file.

Try 1440p at 60fps at 30,000 kbps and see if your network can maintain it. Does the computer handle the downgrade automatically, or should I run the file through some kind of software to downgrade the resolution if I'm streaming over the network?

Some Linux ISOs will be titled "x265 HEVC 10-bit", but the word HDR is missing. None of the torrents for this series are in 4K.

In things with more detailed textures, grass, and foliage, the encoder needs more bitrate. Realistically you need about an 8000 bitrate minimum for 1080p 60fps to not look like garbage, but I'm pretty sure YouTube caps streaming bitrate even lower than 6000.

I'm converting (H.264) 1080p movies to HEVC (H.265 10-bit, x265). RF (not RQ) is a Constant Quality value that indicates a quality threshold (relative to all the other settings, not absolute), where a larger number means lower quality. Preset veryslow means it takes longer to process as it takes more information into account (which results in higher quality and smaller size for a given RF).

Witcher S1:E1 => 1080p: 4536 bitrate; 1440p (HEVC): 1262 bitrate.
I'm editing my old posts about this to say specifically: AOM is unsuited to sharp quality at 720p. I've heard HEVC is better if you're not streaming.

Select 60 as the frame rate. I know for YouTube 25-ish Mbps is fine; their VP9 encode becomes the bottleneck here (from my experience), probably even fewer bits considering I actually scaled my 1920x1080 recordings up to 2560x1440 @ 25 Mbps.

A higher value means a smaller file, but less quality. What I find suitable for H.265: 10-bit x265, and not QSV.

Still lots of love for H.264 though, as some people say it's better. Not all encoders use CRF.

HEVC gives a 40-50% bitrate reduction compared with H.264. People skip H.265 because it encodes very slowly, but the newer profiles can both be faster and deliver a better quality-to-bitrate ratio.

For 1080p, H.264 or HEVC (aka H.265) is fine. Your CPU will be fine for long encodes, but it might take 4-8 hours depending on the film. Otherwise, a 4-6 GB WEB-DL will do just grand.

So an h265 video with a lower bitrate can have the same or even better quality than an h264 video with a higher bitrate. 90%+ of my content is 1080p, and everything new I download has a minimum video bitrate of 4-5 Mbps if H264 or 3-4 Mbps if HEVC. Bitrate and compression time is a game of diminishing returns.

720p only looks marginally better than 1080p on Twitch with the AMD encoder, so you might as well just use 1080p. With high bitrates, the performance gap between AV1 and HEVC is very small. ABR encoding asks what bitrate you need and, based on what you set, will vary the quality to meet it.

For example, an MPEG-2 encode could be 10x the size of h264 without any benefit to quality. Besides, if I'm spending (sometimes upwards of) 24 hrs to encode using HEVC, I shudder to think what that would be with AV1, lol.
For example, if HEVC is 30% more efficient than H264 and you have 300 GB of recordings, recording in HEVC would save you about 90 GB of storage space. So only twice as big a file size, but similar quality, thanks to better HEVC compression at higher resolutions.

I see what you're saying about the CRF/bitrate, but I'm not actually targeting a bitrate.

As you didn't give any specs or say what you're recording, I'm gonna assume 1080p60 and some Nvidia GPU. I am a filthy casual and try to stay around the 10 Mb/s rate, which gives you a 6-10 GB file.

Transcode 4K/HDR10/HEVC down to 1080p while keeping HDR and HEVC and the original files. Hi, I don't know if Plex just converts these files down to H.264.

In general, I recommend recording with H265/HEVC, since AMD's H264/AVC encoder doesn't output as good quality. I've been recording gameplay videos for the past few months and have been using the H264 format with a bitrate of 140 Mbps to capture footage in 4K at 60 FPS.

High-bitrate 1080p will be better than meh-bitrate 4K even at H.264. You can have X video be streamed at 1080p resolution with the bitrate at 500.

140 Mbps only allows roughly 14.5 hours of recording on my 1TB SSD, while most of the games I want to record...

25 Mbps is a lot for 1440p HEVC. I just want to determine the best bitrate for 1080p without going overboard and getting no value for it.

1) I am currently using the DOOM plugin modified to encode an HEVC file (meaning the original is HEVC, going to a lower-bitrate HEVC), using bframes 5 (not sure what the best setting is).

However, bitrate is the cure-all for stream/video quality, and YouTube supports up to 50,000 Kbps (50 megabit) upload.

Add a 0.1 sharpening filter to crisp up the gameplay a bit. I would highly recommend RF20 or better (meaning a lower number) with the 10-bit H.265 encoder.
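Strictly, a 30% size reduction on 300 GB works out to 90 GB saved; the arithmetic is just library size times the efficiency gain. A one-line sketch (the function name is mine):

```python
def space_saved_gb(library_gb: float, efficiency_gain: float) -> float:
    """Storage saved by re-encoding at similar quality with a codec that is
    `efficiency_gain` (a fraction, e.g. 0.30) more efficient."""
    return library_gb * efficiency_gain

print(space_saved_gb(300, 0.30))  # 90.0 GB saved, leaving ~210 GB
```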
HEVC might perform better on the AMF encoder, but it is harder to use in post-processing. There are two factors going into this: high-bitrate 1080p is obviously the best, but over a gigabyte for a single episode is a bit too much.

Update / solution: switch to AV1.

I find Amazon's bitrate and quality to be far better for digital releases than Netflix's, so I try to find those wherever possible. HEVC 8-bit is horrible at banding, and I had to bump CRF up to 17 or so to get rid of it, yielding very large encodes.

Now I downloaded a file from them that had the following title: 1080p.WEB-DL.DDP5.1.x264-NTb.

My test: something I shot at 1080p 60fps, 1:50 long; at a QBR of 18 with the encoding speed set to slow, it still gives me a file size of 125 MB. Here are my test images. Also, bitrate is only one of many factors when creating both h264 and h265.

Since your upload is 30 megabit at best and it's recommended to use only half, I'd say stream H.265 at 1080p; it needs less bitrate to be on par with H.264. I want to retain maximum quality while reducing the file size.

Also, my Samsung TV (although it supports it) doesn't like HEVC and the audio sometimes cuts out, but it plays back H264 files no problem. DaVinci Resolve handles video codecs separately from what Windows has.

Actually, RF23 is going to look pretty meh for 1080p if you look closely.

7.1 audio is supported (though last time I checked, matching the audio output mode for audio recording wasn't keeping the proper channel count, unfortunately). So 4K HEVC has better compression than 720p HEVC. HEVC MP4/MKV is better quality at the same bitrate than an AVC MP4/MKV.

As for what you should set it to, if you are at 1080p then 30 Mbps will work for most things. That said, resolution comes third, after encoding and bitrate. Also keep in mind that most anime is produced in only 720p, so quality gains at 1080p are often not much.
Based on what I know, this means it's an HBO Max WEB-DL encoded using x264 (higher file size) and with E-AC-3 (Dolby Digital Plus) sound.

My rule of thumb for H.265 at 1080p is: RF20 if it's something I don't care particularly much about, RF18 if it's something important. It's more complicated than that, and that's just for recording 1080p/60. E.g. 20 Mbps = 20%.

All else being equal, you can always compress more, at the cost of image quality. At the end of the day, everyone sees things differently. True? Is my bitrate correct?

Going to start desktop screen recording along with a webcam face cam for work.

Both discs were encoded to HEVC at CQ RF 18, with the encoder preset set to Medium, and audio always as passthrough.

They are passing low-bitrate 1080p off as 4K. False advertising.

I'm at the moment doing tests with StreamFX + a GTX 1080 at 1440p@50 in VBR, and so far it's going good.

HEVC often just saves you another ~15% relative to the original remux. Use CRF/CQP instead of fixed bitrate or VBR.

It maxes out at 720p currently, based on the decision of higher-ups. 1080p 60fps HEVC is good enough at 4 Mbps.

There are currently no good UIs that do HEVC NVENC; XMedia Recode has the bones, but it's not nearly robust enough. I would be using the Nvidia NVENC AV1 encoder. I know Twitch usually recommends 6000 kbps for 1080p 60 fps.

With the 1080p Blu-ray encode, I ended up with a 5.5 GB file.
The bottom line is that with HEVC, people shouldn't just assume longer is better, but should play around with the settings on a sample of video to see what works. It saves more space than AC3, and HEVC encoding groups use it too. I'm more setting an upper limit.

It would be for recording, and whilst I don't want a massive file size, I still want a good... Newer codecs with higher efficiency (h265, VP9, AV1) use less bitrate for the same quality than older codecs like h264.

I just wanna point out things that bug me (it's just my personal opinion, plus watching a bunch of very technical YouTube tutorials): upscaling 1080p video to 1440p and uploading it to YouTube just to get VP9 is useless unless the viewer watches the video in 1440p (which most people don't bother with; they just watch it in 1080p). That's what the "HEVC" designates in that smaller-sized upload.

Some people do quality HEVC encodes using a good source, but it's kind of a pain to weed out the bad ones and only grab the good ones, because bitrate doesn't say enough about the quality of the encode, especially for HEVC. QXR makes larger-size encodes using HEVC.

Even at 1440p with better YT compression, you barely see a difference above 8 Mbps in easier games to encode, like Valorant and other games with simple geometric textures. I get either when watching shows/movies etc.

If you are upscaling, add a 0.1 sharpening filter. At the same low bitrate, h265 should look significantly better, but doubling the bitrate won't always double the quality.

I've not been able to find the right options to increase encoder complexity to extract better visuals at lower bitrates with hevc_qsv.

I'll try to explain it as simply as possible. It's important not to conflate resolution and bitrate.

So use something like --crf 22 --film-grain 10 --preset 6, and also consider maybe even using preset 5 for the 1080p clip.
For "Anime" anime, I recommend psy-rd 1 to 2 (more complex/action-packed/detailed anime = higher value needed; it also increases bitrate) and aq-strength 0.6 to 1.

Long story short: no matter what bitrate you send, if you try to stream 1080p 60fps, it's gonna look really bad.

You will get about 10-15% smaller files for the same if not better quality. I will passthrough Atmos and DTS:X, however, to keep the object-based audio track. Are there any other tricks to push the file size lower without a significant reduction in quality?

YouTube compression at 1080p will be the limiting factor. It'll look miles better than what h264 looks like with the AMD encoder. I've done 4K at 60fps with a bitrate of 80 Mbps before for testing purposes and it worked fine. Nvidia's HEVC encoder isn't as good as its H.264 encoder.

You can see that as soon as movement appears on the screen at 2500 kb/s, both HEVC and H.264 lose quality immensely. I don't mind the artefacts that much, but I like the high resolution and I don't want to...

For recording, don't use Constant BitRate (CBR). Some say there is no difference in picture quality when using Medium or Slow in H.265 (and there are downsides, like less compatibility with older/lower-end devices). The real benefit of HEVC is much better coding of UHD (and native support for 10-bit, with optimization for HDR). But yeah.

BluRay gets converted to H.265 (HEVC) at a rate of 50% of the original bitrate, and DVD gets deinterlaced and converted at a rate of 33% of the original bitrate.

The compression and bitrate for 1080p YouTube content are abysmal; there's no need to squander your HDD space on that. I've seen 4K movies encoded at 7 Mbit/s.

I'm always perplexed that if searching for a show, usually there will be a 1080p h264 upload around 1.2 GB. Bitrate is not always the best metric, because a video could be compressed very efficiently, or very poorly, resulting in a bloated file. I tried many different values with several different movies.
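The 50%/33% conversion rule above reduces to a small lookup table. A sketch of how it might be wired into a batch transcode script; the names and example bitrates are my own assumptions, only the ratios come from the comment:

```python
# Ratios from the comment above: BluRay -> HEVC at 50% of the source
# bitrate, DVD -> deinterlaced HEVC at 33% of the source bitrate.
TARGET_RATIO = {"bluray": 0.50, "dvd": 0.33}

def target_kbps(source_kbps: int, source_type: str) -> int:
    """Target HEVC bitrate for a given source bitrate and source type."""
    return round(source_kbps * TARGET_RATIO[source_type])

print(target_kbps(20000, "bluray"))  # 10000 (a 20 Mbps BluRay -> 10 Mbps HEVC)
print(target_kbps(9000, "dvd"))      # 2970
```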
Also, not all 4K/2K is superior to 1080p if it's mastered or encoded poorly. I encode my Blu-rays at H264, veryslow preset, CRF 18. SVT-AV1 crushes x265 when you let it. Will use yours as a starting point, thanks.

From our facility, we produce a 1080p session, using 4K-capable cameras set to 1080p output; any edited video elements are delivered in 1080p, and we use AWS and custom RTMP profiles to various CDNs around the country to deliver our stream.

Since H.265 is more efficient, what bitrates should I be looking to target to achieve roughly the same results as the H.264 files at an average bitrate of 3000 kbps?

1440p, the next resolution tier, which doesn't even come close to twice the number of pixels 1080p has, is "given" more than twice the bitrate by YouTube compared to 1080p videos.

2-pass is basically looking through the entire catalog and finding the best value for the dollar.

For 1080p NVENC HEVC, I definitely had to use more than 50 Mbps once I started caring about visual quality. If you still can't maintain a high bitrate and the quality is bad, set it down to 1080p and try for a high bitrate at 1080p.

For high-quality 1080p encodes at the same encoding speed (x265 is generally way slower than x264), there's really not much advantage to H.265. At 16000 kb/s, though, it gets flipped around, and now AV1 is clearly the most blurry encoder. You can do capped CRF too.

I use rc=constqp qp=30 with I444 color for games and it looks great.

Settings for 1080p HEVC BluRay: I need to figure out what the best settings are to encode my Blu-ray rips with HEVC. Use variable bitrate. I know for a fact a 570 can record 60fps 1080p @ maximum bitrate and with 7.1 audio.

YouTube guarantees transcoding for everyone, so pushing a high bitrate won't be a problem for viewers like it would be over on Twitch. Help is appreciated.
A cartoon needs much less bitrate than a grainy movie, and a grainy movie much more bitrate than a standard movie. As you don't know what bitrate is needed, CRF will adapt itself to the complexity of the content.

One way around that is to use 1080p and HEVC at 30 FPS; the "Standard" bitrate then saves at 30,000k, but because it's HEVC, LosslessCut can't handle it; it has to be H264.

Follow the Twitch guidelines above for that, then. They don't offer transcoding to everyone and disconnect you past 8000 kbps, so keeping to 6,000 usually keeps you accessible to most viewers (going above that without transcoding available can lock out viewers).

Hey all! I'm currently streaming Moonlight in my house via ethernet (using the GameStream Launchpad 1080p/60fps config). The other streams (like audio and subtitles) were only copied into the new file. Bitrate is needed for higher motion and higher detail.

Under the Video tab, set the Video Encoder. I've seen posts mentioning that 20 is OK usually, and that 18 is better in some cases. If you're recording.

The default is P6, tuning is High Quality, multipass is 1/4 resolution, and for the rest refer to NVIDIA's recommended settings.

Assuming a 1080p/24 FPS video, HEVC beats the pants off H264 at lower bitrates like 2000k or 4000k, but this is only true at these lower bitrates. H265 is intensive, and without hardware acceleration I likely wouldn't touch HEVC for this use.

I've tested 40 and 50 Mbit/s as well without any apparent change in decoding latency. You're gonna want at least 7.5 Mbps (ideally 10 Mbps) when using HEVC. Use CRF if possible.

On my i7 8th Gen it takes 6-8 hours on Medium to encode a 1080p REMUX to H.265. CBR on recording just wastes space without improving quality. Choose "Average Bitrate" and select a bitrate that fits your file-size needs.

x265 gives you roughly the same quality as x264 at half the bitrate. Generally, h265/HEVC will have smaller download sizes but be similar in quality to h264.
Keep in mind that quality is very subjective, though. Yes, but the point is you can save a ton of storage by using HEVC to record at a lower bitrate while still keeping the same quality.

For h264_amf, you need to specify that you want to use CQP and also the QP value for each frame type (I, P and B), something like this: -rc cqp -qp_i 10 -qp_p 10 -qp_b 10.

Main advantages of HEVC:
* 40-50% bitrate reduction compared with H.264 at the same visual quality
* Likely to implement Ultra HD, 2K, 4K for broadcast and online (OTT)
* 40-50% bitrate reduction compared with MPEG-2
* Available to deliver HD sources for broadcast and online

At 0.8 FPS, this can end up meaning sometimes over a full 24 hours to take a source UHD disc down to a reasonable CRF 17 encode.

Would animated content (TV shows like South Park, Archer, etc.) require less bitrate?

1080p encodes often end up looking better not because of resolution, but because they are less lossy, due to being encoded with inherently more data. The source that gave me those numbers was this. Do yourself a favor and stream 720p 60fps.

Currently using the Very Fast 1080p preset and changing fps to same-as-source, using .mkv instead of .mp4. If you are looking to keep file sizes low, you will need to use bitrate-based software recording.

Slow takes 8-16+ hours on my laptop. Convert the lossless audio to AC3 5.1.

When the same source video is encoded at 16000k (one H264 and the other HEVC), I found that H264 actually comes out with slightly more detail on very detailed things like grass moving past the camera.

If you set Indistinguishable File Size, the bitrate setting is not used.

As of 2019-02-02, if quality streaming is wanted, it's necessary to stream above 1080p (especially for 60fps). Is the bitrate problem with your Rokus or with your network? I *did* find bitrate recommendations, though (e.g. Netflix's 15 Mbps for UHD).
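The h264_amf flags quoted above can be assembled into a full ffmpeg invocation. A sketch only: the file names are placeholders, and the QP value of 10 comes from the comment, not from any recommendation of mine:

```python
def amf_cqp_cmd(src: str, dst: str, qp: int) -> list[str]:
    """Build an ffmpeg command for AMD's h264_amf encoder in CQP mode,
    setting the same QP for I, P and B frames as in the comment above."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "h264_amf",
        "-rc", "cqp",  # constant-QP rate control
        "-qp_i", str(qp), "-qp_p", str(qp), "-qp_b", str(qp),
        dst,
    ]

print(" ".join(amf_cqp_cmd("in.mkv", "out.mkv", 10)))
```

Building the argument list this way (rather than a shell string) is convenient if you later hand it to subprocess for batch jobs.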
HEVC makes encoding a 50-min video at ~2 GB take ~32 min on my Ryzen 5 5600. It's only about a 20% size benefit compared with h264, and the kicker is that I'd want to do aggressive compression for the smallest possible file.

I have said this before and I'll say it again: I doubt most people could tell the difference between 1080p and 4K in a "blind" test.

H.264 was designed to provide high-quality video compression while maintaining a low bitrate, which is why it is ideal for streaming video over the internet. But, and it's a huge but, your results will vary. Maybe for a solid white background and black text it is great...

YouTube will reduce that on upload but, again, the better the quality you can give it, the better the output.

With the 1080p disc, I encoded to HEVC, keeping the 1080p resolution. I looked in many places and I am getting so many inconsistent answers.

Your J1900 is about 3x less powerful than my laptop, so just do the math. Converting down to H.264 would also lose HDR.

Edit: it seems AOM is not at all sharp at 720p (at similar bitrate) compared to x265, but it looks acceptable when the two are compared at 1080p (in fact with AOM having a smaller bitrate). That's how it saves space: by being able to generate smaller files.

I'd try to get that HEVC issue sorted if possible, as it's a great space saver. For recording and YouTube, use the AMD HEVC encoder: CBR 8000, preset Quality, profile High, 1 B-frame.

My general rule of thumb is that for most live-action sequences at 1080p, h265 requires about half the bitrate of a visually equal (at least to my eyes) h264 encode. What is a good bitrate for 1080p 120fps gaming videos?

There could be a quality loss and I'm not noticing it because of low-end hardware. Otherwise, you will have to transcode down to a lower resolution. 4-5 Mb is most likely kinda broken, and you would be better off with 1080p.
I am looking for OK quality with decent performance and file size, so more toward the lower end of the recommended bitrate, unless a higher bitrate doesn't affect performance and space too much (I have a dedicated 1TB hard drive for footage, but I am quite lazy about actually looking through it all, so I don't clean up unwanted footage often).

Which Air? M1? M3? An older Intel one? Either of the new ones will have at least H.264 and HEVC support in the Apple hardware encoder.

And as you then don't need to spend more bitrate on the grain like on x265, you can use a higher CRF.

HEVC Quality: 35, Encoder Speed: Medium, AAC 96 Stereo (resolution same as source, variable bitrate), which averages around 400-600 MB per movie-length content.

If using HEVC for 4K vs. h264 for HD, then yes, it can still be better with a lower bitrate, because HEVC is about double the efficiency of h264 (not precise, but close enough), so a 1080p 20 Mbps HEVC file is in theory equal in image quality to a 40 Mbps 1080p h264 file. Alright.

Use NVENC: AV1 if available, or HEVC as the second option. Perhaps playing with the frame-pacing settings could also help; I have it set to "Balanced".

But what's the golden zone for both types? 10 Mbps for a 1080p H.264 movie? But it's better to use CRF instead of 2-pass. Bitrate is a bigger indicator of quality (as long as the content is at least 1080p). I would also stream at 60fps, since you can.

Sometimes the trade-off can be with compatibility, but this shouldn't be an issue with HEVC. The Force Awakens encoded at CRF 17 slow only has a video bitrate of 7.7 Mbps, roughly 20% of the original remux.

If 1080p with high bitrate can stand up next to 4K with low bitrate and still look almost as good? They're identical bitrates, so as long as you're not upscaling the 1080p to 4K, they will (essentially) look the same at 1080p. Always.
My current setup is: RTX 3060 (host/W10) -> switch -> router -> GTX 1650 Ti (client/W10), 2-pass 5000 kbps for 1080p.

H.264 at 20 Mbps is less than what a Blu-ray uses for 1080p at 24fps… so more than double the framerate and a smaller bitrate… that's not exactly a bar I'd consider "diminishing" at all. That being said, release groups usually encode their HEVC releases at lower bitrates than their AVC ones (small-size RARBG 1080p x264 vs small-size RARBG 1080p x265, for example).

I have a question regarding desired bitrate when it comes to animated content.

A bitrate of around 10,000 kbps is recommended.

I want to switch to AV1; what kind of bitrate would I need to maintain the same visual fidelity? I've seen that Netflix apparently uses a bitrate of 15 Mbps, but YouTube recommends a bitrate of 35-45 Mbps.

TV shows are 720p, 5-6 Mbps, Blu-ray rips if possible. So watching something like MASH, you're focusing basically just on the characters and not the background stuff, so gamma ("brightness and clarity") can be reduced without really losing anything of value, and that's a great trade-off for a 30% reduction in file size.

I'll also add my specs just in case it's hardware dependent: Ryzen 9 3900XT, RTX 3070 FTW3, 16 GB DDR4. I always used the slower 2-pass encoding to make sure I got a good file with no issues, and for a 2-hour movie I recall the process taking roughly 4-5 hours on my old stock CPU.

These are my results, all with the same settings in OBS (other than bitrate and encoder). It's literally the only working option for AMD.
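The "2-pass 5000 kbps for 1080p" approach mentioned above maps onto ffmpeg's two-pass x265 workflow; a command sketch with placeholder filenames (`input.mkv`/`output.mkv` are hypothetical, and the 5000k target is the figure from the comment, adjust to taste):

```shell
# Pass 1: analysis only, output discarded (audio disabled for speed).
ffmpeg -y -i input.mkv -c:v libx265 -b:v 5000k -x265-params pass=1 -an -f null /dev/null
# Pass 2: encode using the stats file from pass 1; audio copied untouched.
ffmpeg -i input.mkv -c:v libx265 -b:v 5000k -x265-params pass=2 -c:a copy output.mkv
```

Two-pass is mainly useful when you need to hit a specific file size; as other comments here note, CRF is usually the better choice when size isn't a hard constraint.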
The bitrate curve is mostly preserved, but the total bitrate is raised or lowered based on your limits, resulting in a final product that's IMO less arbitrary and takes better account of how each bit is best used. If you're seeing SVT-AV1 underperform x265 in visual quality at the same bitrate, you're either testing very high bitrates, using too fast a preset, or both.

My laptop (i7-5600U) uses about 1% CPU for every Mbps of bitrate in software decoding.

H.265 / RF 18 / Medium.

Try H.265 or AV1 as the live-broadcast codec. Use .mkv instead of .mp4.

You will need to make sure that you either (a) have a more powerful playback device or (b) have a playback device with a dedicated HW decoder.

For movies I prefer somewhere between 8-10 Mbps at 1080p, with excellent quality. If your videos are mostly static content, I guess the bitrate would be the same or even lower, because compression would be more efficient, and the low complexity would allow you to encode using Preset 6.

I have noticed grainy backgrounds in racing games at 30 Mbps; everything else works totally fine.

For 4K films (HEVC x265), if your system can handle it, I run the nightly build version.

H.265, also known as High Efficiency Video Coding (HEVC), is a video codec that was released in 2013. HEVC doesn't benefit much from higher presets.

If you genuinely desire quality on YouTube, you need to increase the resolution. Bitrate is set to 30 Mbit/s, which is more than enough for 1080p with HEVC.

When you watch something, the human eye naturally focuses on gamma and movement. H.265, for example, needs about 50% less bitrate for the same quality compared to H.264.

Try H.265/HEVC with a 15,000 kbps bitrate, and see what it looks like and whether your internet is stable enough to not have dropped frames.

It is possible the transcoded video will be at a higher bitrate than the original, since H.264 is a less efficient codec; transcodes to H.264 lose quality immensely.
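Combining the "~1% CPU per Mbps in software decoding" observation above with the earlier "your J1900 is about 3x less powerful than my laptop" comment gives a back-of-the-envelope load estimate (both scaling factors are from the thread; the 15 Mbps stream is just the example bitrate mentioned here):

```shell
stream_mbps=15      # e.g. the 15,000 kbps stream suggested above
cpu_per_mbps=1      # ~1% CPU per Mbps on the i7-5600U, software decode
laptop_cpu=$((stream_mbps * cpu_per_mbps))   # ~15% CPU on the laptop
j1900_cpu=$((laptop_cpu * 3))                # ~45% on a CPU ~3x weaker
echo "laptop ~${laptop_cpu}%, J1900 ~${j1900_cpu}%"
```

This is the kind of rough math behind the advice to test decode on your weakest playback device first, or to rely on a dedicated hardware decoder instead.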
I have encoded 1080p in H.264 down to about 4 GB with no visible loss of quality. And those 1080p WEB-DLs are usually already bitstarved.

I'm looking forward to some optimizations for AV1 as the real successor to H.264. I have a bunch of H.264 1080p videos I'd like to transcode in order to save space.

For 4K you will probably need to crank it up to 80.

I'm streaming 1080p 24fps content on YouTube and have 20 Mbps upload speed available.

Where HEVC really shines is with recent, digitally shot films.

Though this may not be related to Handbrake, I use the following settings with NVENC on XMedia Recode and Selur's Hybrid programs for 1080p content only. For HEVC: Video Encoder: NVENC HEVC/H.265.

(2-PC setup, main rig with an RTX 3080.) 42 to 55k bitrate, and so far with Cyberpunk 2077 the quality loss almost isn't there, which works for me.

You always push as much bitrate as humanly possible to YouTube.

I've done double-blind tests with lossless/lossy audio, and with a sufficient bitrate I can't hear the difference.

HEVC is absolutely "worth it" in terms of quality for file size at 1080p, but if you only care about bitrate and not storage then there are better solutions.

They are all 1080p and around 2 GB each episode. Noticing the shit bitrate on Netflix, and Netflix being the only source for this series, tells me that what Netflix is calling 4K is actually not 4K at all.

A constant bitrate is the worst of both worlds: too much bitrate (more storage) for static periods, too little bitrate in high motion/detail.

Select the AppleTV 2160p60 4K HEVC Surround setting underneath Devices and make a few changes. I will confirm when I arrive home. Great!

I've found that higher-bitrate 720p looks better than 1080p unless the 1080p file is at least double the bitrate of the 720p file, which comes at a significant storage cost.

H.265 can be hard on old hardware to decode on the fly, so run some test footage through your main device first.
For 1080p, I've seen movies encoded in 1.5-2 GB, but I rarely use H.264. Depends what you want.

This produces a decent enough output, but I'm hoping to extract either a little more quality or a little lower bitrate.

For a live-action film, a 720p encode from a 1080p source would, at the same RF, always create a larger file than if you'd used a 720p source file. Hence the choice between the ones in the title.

Start with a value of 18 and adjust from there.

Compare that to the TV episode of 57 minutes with a filesize of 277 MB; for that size the quality is astonishing.

And if you've got older (= softer/grainier) movies, getting good results takes a lot of expertise.

I'm just wondering if current AV1 encoders like aom or SVT would be worth it over my current HEVC for time vs bitrate savings. EDIT 2: A lot of the videos I want to encode are livestreams where the VODs are not left publicly available after ending, so downloading a YouTube-encoded VP9 or AV1 video isn't always possible.

Well, they are and aren't.

This all sounds good; I check the bitrate in the torrent info and it says "Overall bit rate: 9 905 kb/s".

Basically, YouTube's bitrate limit for 1080p is too low; you could throw 50 Mbps 1080p at YouTube and it'll still compress it to piss-poor quality. YouTube 1440p or 4K, though, gets a huge jump in bitrate, so if you have the processing power and upload, render out at 1440p, then watch at 1440p or above on YouTube for a nice quality boost :)

I use bitrate and 2-pass encoding with my tuned presets in HandBrake. I've transcoded some of my AVC (H.264) videos to H.265.

H.265 (HEVC) is an amazing codec. Now, to do a simple NVENC encode (that'll even work for the 9XX series), start with: ffmpeg -i <inputfile> -c:v hevc_nvenc -profile:v main -preset slow -rc vbr -multipass fullres -qmin 15 -qmax 20 -c:a copy <outputfile>

This may seem like a low number, but I am trying to optimize my library with a bitrate of 1.
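The HandBrake constant-quality settings quoted in this thread (x265, RF 18, medium preset) also have a command-line equivalent via HandBrakeCLI; a sketch with hypothetical filenames (`-q` is the RF value, and audio is passed through rather than re-encoded):

```shell
# Constant-quality x265 encode: RF 18, medium encoder preset, all audio tracks copied.
HandBrakeCLI -i input.mkv -o output.mkv \
  -e x265 -q 18 --encoder-preset medium \
  --all-audio -E copy
```

Starting at RF 18 and adjusting from there, as suggested above, keeps you in the range where 1080p HEVC encodes tend to be visually indistinguishable from the source.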
HEVC (x265) requires a lower bitrate to achieve roughly the same quality as an x264 video encoded from the same source. So 4K is 4 times as big a resolution as 1080p.

What I learned is that a quality value of 52 using HEVC (H.265)… but it sounds like actual bitrate may vary wildly, so that's probably a poor way to choose CQ.

In my case, Y video streams at 7000 kbps with very good video quality. Some say 12 Mbps, others have said 20, 50, etc.

Since you have the bitrate, I would bump up to 4K even if it is an artificial upscale.

Ahah, well, that post is now a bit irrelevant anyway because the app allows exporting at lower bitrates now.

You can do a lot with AVC.

I've been uploading my YouTube videos at 100,000 kbps bitrate; is this overkill, since it seems every site says 15,000 kbps should be fine? I noticed in the past that if I went with the "Best" setting for bitrate in Resolve, my videos came out looking kind of like crap once uploaded to YouTube, so I manually set it to 100,000 kbps and they come out looking great once uploaded.

AV1 can be monstrously good, but as I consume all my personal media via Plex, it's a no-go.

With my 3080 I use HEVC VBR 35-45k at p5 and get pretty good quality.

Log in to Twitch in the stream tab and click "ignore streaming service recommendations".

This swings between 3.6 and 8 Mbps in H.265 with those settings. Best to let it run overnight.

Since you have an Nvidia card, you can use HEVC. Generally speaking, HEVC will give you the same quality at about half the bitrate of AVC, and as a result an HEVC video will usually be significantly smaller than the same video encoded with AVC, without losing quality.
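Because the comment above puts HEVC at about half the AVC bitrate for the same quality, the storage impact across a whole library is straightforward to estimate; a sketch with hypothetical library numbers (100 movies at 8 GB each is an illustrative assumption, not a figure from the thread):

```shell
movies=100
avc_gb_each=8
hevc_gb_each=$((avc_gb_each / 2))   # half-bitrate rule of thumb -> ~4 GB per movie
saved_gb=$((movies * (avc_gb_each - hevc_gb_each)))
echo "saved ~${saved_gb} GB"        # ~400 GB across the library
```

That kind of saving is why people in this thread treat HEVC as "worth it" for storage, while noting the real cost is re-encode time and playback-device compatibility.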