
RngdQc

Deactivated
  • Content count

    111
  • Avg. Content Per Day

    0
  • Donations

    $35.00 
  • Joined

  • Last visited

1 Follower

About RngdQc

  • Rank
    Enthusiast
  • Birthday 03/28/1988

Profile Information

  • Gender
    Male
  • Location
    Montréal

Gaming Info

  • IGN
    RngdQc
  • Steam
    http://steamcommunity.com/id/RenegadeQc/
  • Battlelog ID
    http://battlelog.battlefield.com/bf3/user/RngdQc/

System Specs

  • OS
    Win 7 64 bit
  • Mobo
    M5A99X EVO (AM3r2)
  • Processor
    AMD Phenom II X4 975, OC'd @ 4 GHz
  • Graphics
    EVGA GTX 550 Ti FPB x2 (SLI), OC'd
  • PSU
    Cooler Master eXtreme Power Plus 700W
  • RAM
    Kingston PC3-10700, 8 GB, 1600 MHz
  • Storage
    HDD 1: 500 GB / HDD 2: 2 TB
  • Audio
    Onboard 5.1
  • Monitor
    ASUS VE276
  • Case
    Cooler Master HAF 922
  • Peripherals
    Mouse: Cyborg R.A.T. 7, Keyboard: Cyborg V.7, Headset: CM Storm Sirus, Mousepad: Razer Vespula, Speakers: Logitech Z906
  • Speed Test
    http://www.speedtest.net/result/2833701754.png
  1. EVE Online: Rubicon Trailer

    Name: EVE Online: Rubicon Trailer / Category: Video Games / Date Added: 14 November 2013 - 05:45 PM / Submitter: RngdQc (http://www.adkgamers.com/user/4651-rngdqc/) / View Video: http://www.adkgamers.com/videos/view-53-eve-online-rubicon-trailer/
  2. ADK tag for bf4

    I mean the tag to the right of your soldier name. Yes, I played BF3... way too long ago. lol
  3. ADK tag for bf4

    What's the current ADK tag for the BF4 servers? You can't enter 5 characters, so =ADK= doesn't work.
  4. AMD New line of GPU's

    So you buy something and don't want to get all the power you can from it? Weird :P
  5. G-sync!

    It's going to be compatible with some older monitor models (not all of them), and with pretty much any Nvidia card built on the Kepler architecture.
  6. G-sync!

    AND this was my 100th post! YAY GRATS TO ME lolz :)
  7. G-sync!

    http://www.extremetech.com/gaming/169091-nvidias-g-sync-promises-better-faster-monitors-that-will-revolutionize-pc-gaming

    Nvidia has demonstrated a new display refresh technology that’s meant to move v-sync (vertical synchronization) out of the stone age and create a solution better suited to the needs of modern displays. In doing so, it’s exposed a fundamental problem of technology: oftentimes, our standards aren’t based on any objective evaluation of what’s “good,” but simply on what worked at a given point in time. Film standardized on 24 frames per second because it was fast enough for eyes to perceive as motion, fast enough to keep the highly combustible film from igniting under the projection lamp, yet slow enough not to cost enormous amounts of money per movie. The 60 Hz refresh rate we’re all familiar with was standardized because vacuum tube technology needed to run at a multiple of the AC frequency. When we moved to LCDs, we stopped using an electron gun to redraw the screen, yet we still redraw it a fixed number of times per second. Nvidia wants to fix that with its new G-Sync display technology.

    The issue, in a nutshell, is that graphics cards don’t render at fixed speeds. While we’ve discussed this in our coverage of frame latency issues, those discussions focused entirely on the video card side of the equation: how long the GPU takes to draw and output each frame, and how variations in that timing can lead to suboptimal displays. The entire reason we use v-sync is that it prevents tearing. With v-sync off, the video card shoves new images to the monitor as quickly as it can, and the monitor updates as quickly as it can, with no regard for whether the image being overwritten is fully synchronized at top and bottom; the result is a visibly torn picture (the original article includes a screenshot). V-sync fixes this by capping the frame rate. You can buy a display with a 60-144 Hz refresh rate (the 144 Hz displays are “true” 144 Hz and do not use interpolation as some high-end televisions do), but a higher refresh rate doesn’t fix the underlying problem either: the stutter and tearing return as soon as the frame rate dips below the set boundary. Nvidia has previously attempted to fix this on the GPU side with what it calls Adaptive V-Sync, but G-Sync is something different.

    Instead of relying on GPU-side timing alone, G-Sync is a physical chip that integrates directly into a monitor. It is compatible with all Kepler-based graphics cards, and should be compatible with all Nvidia GPUs going forward. According to NV, G-Sync synchronizes the monitor to the graphics card rather than the graphics card to the monitor, and promises such smooth gameplay that internal testers “have found themselves overshooting and missing targets because of the input and display lag that they have subconsciously accounted for during their many years of gaming.” By drawing frames only once they’re ready, the display allows for variable frame rates and smooth playback at the same time (see the frame-timing sketch after this list). Feedback from people who have seen the system in person has been enthusiastic. Nvidia’s G-Sync module pairs a 768 MB buffer with a custom FPGA.

    Nvidia plans to make the technology available in two different ways. First, those of you with existing monitors (otherwise known as “everyone”) will be able to upgrade an Asus VG248QE display with a standalone kit. No word on whether owners of other displays will be able to upgrade, or whether the kit will come to the 27-inch Asus monitors in that family. Otherwise, you’ll be able to buy a G-Sync monitor next year, at resolutions from 1920×1080 all the way up to 4K. Given the feedback from testers, this could be a major boon for the gaming industry, and, coincidentally, it’s an NV-only feature.

    If you think about it, this is damned smart of Nvidia. While a G-Sync module will presumably allow a monitor to work with any video card (just like normal), there’s no reason to think an AMD GPU will be able to hook into the feature and use its specialized capabilities. Since most gamers upgrade video cards every 2-3 years but may use a display for considerably longer, this increases the chance of someone buying several video cards from Team Green in a row. AMD will almost inevitably answer with a project of its own, possibly open source. Whether gamers will want to pay a premium for G-Sync is a fair question, but I suspect a number will; after all, improved image quality is ostensibly why people buy better monitors, and the boost here, according to all sources, is quite significant.
  8. I used my credit card for this: http://www.newegg.ca/Product/Product.aspx?Item=N82E16814130948
  9. WOOHOO!! lol, that means I need a new video card; I sure ain't going to play on medium settings.
  10. Dafuq did I just watch?! rofl. Almost 100 million views, ffs... I've lost hope in humanity.
  11. Juno!

    http://missionjuno.swri.edu/origin?quality=high In 22 hours 45 minutes, Juno will slingshot itself with the help of Earth's mass (gravity) to reach Jupiter in 2016!! Isn't that just crazy! Gotta love science <3 (A quick back-of-the-envelope look at the slingshot math follows after this list.)
  12. They obviously have to... sounds like EA is acting like Microsoft did with the Xbox One's DRM... might not be a good sign.
  13. I used to be able to play BF3 at almost all ultra settings. I tried the BF4 beta, and I must say I'm honestly disappointed. The lag is the worst part, but I had texture issues too. I'd been waiting for the beta before pre-ordering, and so far, for me, it's no BF4. I'll check back once in a while to see if anything changes. It sucks, because BF3 was the reason I joined ADK; I'll stick with EVE for now. But yeah, every time EA touches something, you can be sure it's going to be a pain in the ass. Just look at what happened to SimCity. And the lack of information/communication from DICE/EA is lame. Fan/community driven? Uhh, no... money driven lol
  14. AMD New line of GPU's

    Don't forget Nvidia is coming out with the Maxwell architecture for their 800 series, with an integrated ARM CPU (2014): http://en.wikipedia.org/wiki/GeForce_800_Series
  15. SteamOS

    http://blogs.nvidia.com/blog/2013/09/25/steam-rolling-into-your-living-room/

    The community, content and convenience of Steam are en route to a living room near you. And best of all, it’s free.

    Valve is one of the most innovative forces in the gaming industry, and that sentiment is amplified by Monday’s announcement of SteamOS. As a long-time PC gamer, I know that no conversation about the pillars of the game industry would be complete without mention of Valve Software. From their award-winning and innovative games to Steam, the leading PC game distribution and community platform, Valve is synonymous with PC gaming and with the ability to innovate. Now Valve is at it again!

    The concept of SteamOS really hits home with me because there are three things I’m very fond of: big screens, PC games and customer choice. SteamOS is an elegant way to get your PC games into your living room and onto your biggest screen. Built around the already familiar Steam client, SteamOS is a version of Linux enhanced for gaming, and for gaming on the big screen in particular. Combined with the fact that Valve is giving it away for free to users and hardware providers alike, SteamOS has the potential to usher in a new era for gaming in the living room. Anyone can build hardware and software for the living room on an operating system designed to be lightweight, extensible and optimized for gaming. Suffice it to say, we here at NVIDIA are very excited!

    Engineers from Valve and NVIDIA have spent a lot of time collaborating on a common goal for SteamOS: to deliver an open-platform gaming experience with superior performance and uncompromising visuals directly on the big screen. NVIDIA engineers embedded at Valve collaborated on improving driver performance for OpenGL, optimizing performance on NVIDIA GPUs, helping to port Valve’s award-winning content library to SteamOS, and tuning SteamOS to lower the latency, or lag, between the controller and onscreen action. The collaboration makes sense, as both companies strongly believe in the importance of open-platform innovation and are committed to providing gamers with a cutting-edge visual experience. Valve will deliver a great, open-platform gaming experience, and NVIDIA will continue to be the best choice for gaming on any open platform or operating system, including SteamOS.
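
A note on the frame-timing argument in the G-Sync post (item 7): the contrast between fixed and variable refresh is easy to see in a toy model. The Python sketch below is only an illustration under simplifying assumptions (it ignores v-sync back-pressure on the GPU and uses made-up render times); it is not Nvidia's implementation. With a fixed 60 Hz panel, a finished frame waits for the next refresh tick, so uneven render times turn into uneven presentation (judder); with a G-Sync-style variable refresh, presentation simply tracks the GPU.

    # Toy model of frame presentation timing; not Nvidia's implementation.
    REFRESH_INTERVAL_MS = 1000.0 / 60.0  # fixed 60 Hz panel

    def vsync_present_times(render_times_ms):
        """Fixed refresh: a finished frame waits for the next 60 Hz tick."""
        now, shown = 0.0, []
        for rt in render_times_ms:
            now += rt                               # frame becomes ready here...
            ticks = -(-now // REFRESH_INTERVAL_MS)  # ...ceil to the next refresh
            shown.append(round(ticks * REFRESH_INTERVAL_MS, 1))
        return shown

    def gsync_style_present_times(render_times_ms):
        """Variable refresh: the panel scans out as soon as the frame is ready."""
        now, shown = 0.0, []
        for rt in render_times_ms:
            now += rt
            shown.append(round(now, 1))             # no quantization to a tick
        return shown

    frames = [14.0, 18.0, 15.0, 22.0, 16.0]  # hypothetical GPU render times (ms)
    print(vsync_present_times(frames))        # [16.7, 33.3, 50.0, 83.3, 100.0]
    print(gsync_style_present_times(frames))  # [14.0, 32.0, 47.0, 69.0, 85.0]

Note the 33 ms gap between the third and fourth v-synced frames even though no single render took longer than 22 ms; that quantization stutter is exactly what a variable-refresh display avoids.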
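
And on the Juno post (item 11): the slingshot works because, in the Sun's frame, a flyby transfers some of Earth's orbital velocity to the spacecraft. A back-of-the-envelope sketch, not Juno's actual trajectory numbers: in the idealized coplanar case where the encounter fully reverses the probe's planet-relative velocity, the heliocentric speed gain tops out at twice the planet's orbital speed (v_out = v_in + 2U). The 25 km/s inbound speed below is a made-up figure for illustration.

    # Idealized gravity-assist upper bound; not Juno's real flyby numbers.
    EARTH_ORBITAL_SPEED_KM_S = 29.8  # Earth's mean heliocentric speed

    def max_assist_speed(v_in_km_s, planet_speed_km_s=EARTH_ORBITAL_SPEED_KM_S):
        """Heliocentric speed cap after a flyby that fully reverses the
        probe's planet-relative velocity (coplanar, idealized)."""
        return v_in_km_s + 2.0 * planet_speed_km_s

    print(max_assist_speed(25.0))  # hypothetical 25 km/s inbound -> 84.6 km/s cap

Real assists, Juno's included, pick up only a fraction of that bound, because the deflection angle is limited by how close the probe can safely pass.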