What's A Safe Temperature For Mining? - CryptoVoid

My GPU runs hotter when idle. The moment I move the mouse it starts cooling down. (Win10 v1903, Build 18362)

Ok so I'm using EVGA Precision X1 to monitor temps and to set my fan curve, works fine. But I noticed that when I go away for a bit and come back, my GPU temp will be sitting around 55 degrees with the fan going at ~65%. The SECOND I move my mouse, the temp quickly drops until it sits at 20-25 degrees.
The first thing I can think of is that something is sneakily bitcoin mining in the background, but scans have come up clean. What can I do to see what is responsible for this?
Cheers.
Edit: Solved: a malware bitcoin miner had infected a file. Quarantined it and the issue disappeared.
submitted by joelkemu to techsupport [link] [comments]

Complete Guide to OverdriveNTool

We present the complete guide to overclocking GPUs with OverdriveNTool for your Ethereum mining rig! In this special we cover OverdriveNTool, in our opinion the most efficient, fast and immediate software for overclocking GPUs dedicated to mining.
The interface is simple and no-frills, as if to suggest that the program was built to get straight to the point.
Remember that after installing the drivers (see our guide to building a 6-GPU Ethereum mining rig) you will need to open Radeon Settings, select Gaming, then Global Settings, and make sure HBCC memory is disabled for each GPU in your mining rig. Do the same with the Crossfire option, checking that it too is disabled. Reboot the system and verify that HBCC and Crossfire really are disabled on all video cards before proceeding.
Software download and technical specifications at the following link: https://forums.guru3d.com/threads/overdriventool-tool-for-amd-gpus.416116/
Note that the GPU numbering in ATIFlash corresponds one-to-one to the numbering in ONT and Claymore, with no misalignment.
First, open in ONT the BIOS you previously modified with Polaris Bios Editor or, alternatively, a stable BIOS mod downloaded from a specialized site such as Anorak's. You can also overclock the GPU's original BIOS. Follow the OverdriveNTool guide carefully when operating at these levels!
Click New to create a new profile for the selected GPU. At first you will be on tab 0, which corresponds to GPU 0 in ATIFlash and Claymore. I repeat once again: identical GPUs can behave differently; for this reason, the most stable final overclock may vary from card to card. Simply load the first profile on each subsequent tab, select New, make the necessary changes, and save under a different name (ideally a recognizable one, such as GPU1-OC-Memory or GPU2-Temp, etc.).
The stages of the GPU and RAM. On the left are the stages (P-states) of the GPU, with the corresponding voltage for each one. Some users disable the first six stages (P1 to P6) so that, once the miner is launched, the GPU jumps immediately to the last stage. For those who, like us, restart the rig only once every 2 or 3 days, or even less often, this step is unnecessary.
We recommend leaving them enabled, at least for the first tests. Once you have reached the card's limit, you can check whether disabling them brings any improvement in the on-screen hashrate without the pool results suffering. In effect, our goal is a high hashrate with a minimal percentage of errors on the pool, even at the cost of a slightly lower hashrate reported by the rig.
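The trade-off described above, raw rig hashrate versus clean shares on the pool, can be made concrete with a quick calculation. This is an illustrative sketch with made-up numbers; the function and the figures are hypothetical, not measurements from our rig:

```python
def effective_pool_hashrate(local_mhs: float, invalid_share_pct: float) -> float:
    """Hashrate the pool actually credits, assuming invalid or stale
    shares are simply lost work. All numbers here are hypothetical."""
    return local_mhs * (1 - invalid_share_pct / 100)

# A slightly slower but stable rig can beat a faster, error-prone one:
aggressive = effective_pool_hashrate(245.0, 6.0)    # pushed too hard
conservative = effective_pool_hashrate(240.0, 0.5)  # tuned for stability
print(aggressive < conservative)  # True
```

Here the "slower" rig earns more, which is exactly why we measure success on the pool rather than on the miner's console.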
In the central part we find the memory speed, divided into three stages. We will operate directly on the last one.
On the right you can see the fan speeds, the target temperature the fans must maintain (in our BIOS mod it is set at 75 °C, a value we never actually reach), and the acoustic limit (a parameter always worth keeping in mind in a rig).
The last section at the bottom right, Power, is divided into the maximum allowed temperature (84 °C on our Pulse cards, 75 °C on the XFX) and the Power Target, which is strictly tied to the modified BIOS we are overclocking. At the end of all tests, if one or more GPUs are unstable, you can try giving them less power, starting from -25%.
In this guide we will refer to the XFX RX 580 8GB GDDR5, with the GPU clock at 1200 MHz and memory at 2150 MHz: eight theoretically identical video cards in total.
Let's put into practice what has been written up to now ...
We immediately opted to lock the stages, operating directly on the last one for both the GPU clock and the RAM. From these levels, start lowering the voltage of both the GPU and the RAM, checking hashrate, power consumption and system stability after each change (5-10 minutes is usually enough). When the voltage is too low, the GPU will not start mining.

The goal is to obtain the best performance-to-consumption ratio, always measuring the results against the pool. A very high hashrate or very low consumption can often produce numerous errors during mining.
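The performance-to-consumption ratio we aim for is simply hashrate divided by power draw. A minimal sketch of how you might rank candidate configurations; the test points are hypothetical, not our measured values:

```python
def efficiency(hashrate_mhs: float, watts: float) -> float:
    """Performance/consumption ratio: MH/s per watt."""
    return hashrate_mhs / watts

# Hypothetical single-GPU tuning results:
configs = {
    "stock":       (30.0, 135.0),   # (MH/s, watts)
    "undervolted": (29.5, 100.0),
    "aggressive":  (24.0, 88.0),    # too low a voltage: hashrate collapses
}
best = max(configs, key=lambda name: efficiency(*configs[name]))
print(best)  # undervolted
```

Note how the most aggressive undervolt loses more hashrate than it saves in power, which matches what we see in practice.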


With 8 RX580 8GB video cards we reached a total consumption (thus including all the components of the RIG) of 770 Watts for an average of less than 100 Watts per GPU.

The result was achieved by bringing the GPU core voltage to 1000 mV and the RAM to 900 mV. Lower values are theoretically possible but could cause system instability. As mentioned previously, each video card is different, and on one of the eight GPUs we were forced to lower the power target by 25%.

After these tweaks, we got results on the pool with a hashrate often above 240 MH/s.


We would like to emphasize that GPU overclocking is the operation that will take you the longest time: it can take hours to find the so-called "sweet spot" of each video card. Our OverdriveNTool guide will surely help you!

But this achievement will give you great satisfaction, we guarantee it.
Below are the stable settings for the RX Vega 64 video cards of our 13-GPU mining rig, of which you can see some videos on our YouTube channel: https://www.youtube.com/channel/UCdE9TTHAOtyKxy59rALSprA

See you soon for the next guide dedicated to mining!

If you liked this article and would like to contribute with a donation:

Bitcoin: 1Ld9b165ZYHZcY9eUQmL9UjwzcphRE5S8Z
Ethereum: 0x8D7E456A11f4D9bB9e6683A5ac52e7DB79DBbEE7
Litecoin: LamSRc1jmwgx5xwDgzZNoXYd6ENczUZViK
Stellar: GBLDIRIQWRZCN5IXPIKYFQOE46OG2SI7AFVWFSLAHK52MVYDGVJ6IXGI
Ripple: rUb8v4wbGWYrtXzUpj7TxCFfUWgfvym9xf
By: cryptoall.it Telegram Channel: t.me/giulo75 Netbox Browser: https://netbox.global/PZn5A
submitted by Giulo75 to u/Giulo75 [link] [comments]

Watercooling dynamics

I recently started mining bitcoin, so the PC is at full load all the time. I control the fans based on water temperature. I tried different settings: first, I set the fans to 100% at 35 °C; the fans were always at 100% and the temps sometimes exceeded 35 °C. When I set the fans to hit 100% at 50 °C, the water went to 40 °C but the fans now run at 50-60%. The CPU/GPU temps haven't changed at all. How does this work? It seems like higher water temperature with slower fan speeds gives the same cooling capacity as faster fans with cooler water.
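The behavior described above is what a simple steady-state model predicts: a radiator sheds heat roughly in proportion to airflow times the water-to-ambient temperature difference, and the loop settles at whatever water temperature makes that heat shed equal the constant mining load. A toy sketch, where the constant K and all figures are assumptions for illustration, not real radiator data:

```python
K = 25.0          # W shed per (unit of fan fraction x degree C); assumed value
T_AMBIENT = 25.0  # room temperature in degrees C, assumed

def heat_dissipated(fan_fraction: float, t_water: float) -> float:
    """Rough radiator model: airflow times water-to-ambient delta."""
    return K * fan_fraction * (t_water - T_AMBIENT)

# Fans pinned at 100% with ~35 C water vs. fans at 55% with ~43 C water:
full_speed = heat_dissipated(1.00, 35.0)   # 250 W
relaxed = heat_dissipated(0.55, 43.2)      # ~250 W as well
print(full_speed, relaxed)
```

Both operating points shed the same ~250 W load: the rig simply trades fan speed for a warmer loop, which is why the cooling capacity comes out the same either way.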
submitted by tonnentonie to watercooling [link] [comments]

Mining noob, I have some questions

Hi everyone, a quick intro here: I come from a professional horticulture background. I've been learning about computers, networking, network security and Linux sysadmin for the last two years. I built a bunch of gaming computers for my kids and me with a bonus check I got in fall of 2017, right before the 2017 "bitcoin bubble". By luck I grabbed all my parts before the price of GPUs skyrocketed. All I've been doing though is learning about Linux and game development, learning digital art like 3D modeling, and streaming video games.
I'm now learning to mine ZEC with tpruvot/ccminer 2.3.1 on Ubuntu 20.04 with the Nvidia proprietary driver ver. 440 & CUDA toolkit 10.1. I'm just learning how to do this and understand I'm not making a profit; it's more of a learning experience and hobby for now. I don't really care if the system breaks, I have another computer with an AMD RX 560 that I work and game on under Linux. I can't mine with the Polaris GPU because I can't install OpenCL; there is no Catalyst driver support for 20.04 as of now.
TL;DR I'm a noob wondering why my hashrate is what it is. I'm only using one GPU for now (Nvidia 1050 Ti 4GB) and mining on a pool, getting an average of 140 Sol/s. Is this essentially the same as H/s, and is that a normal number for my card? Should I add the second GPU I have, even though it's only a 1050 2GB? Also, I'm using the nvtop & htop packages to monitor PC stats; they show 99% GPU usage and 100% of a single core of my CPU (Intel i5-6402P @ 3.2GHz), and fans and temps are good.
But it shows I'm only using 0.6 GB / 4 GB of VRAM while mining; is that right? Shouldn't it be using more memory? Would it be overkill to mine with a CPU miner at the same time as the two cards?
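On the Sol/s question above: for Equihash coins like ZEC, miners and pools report solutions per second, and for comparing cards it is used the same way H/s is used for other algorithms. For a rough sense of what a given rate earns, the usual back-of-the-envelope is your share of the network rate times the daily block emission. A sketch with hypothetical figures: only the ~1152 blocks/day follows from Zcash's 75-second block target, while the network rate and reward below are made up for illustration:

```python
def expected_zec_per_day(my_sols: float, network_sols: float,
                         reward_zec: float, blocks_per_day: float) -> float:
    """Expected daily earnings as your fraction of the network solution rate."""
    return (my_sols / network_sols) * reward_zec * blocks_per_day

# 140 Sol/s against a hypothetical 5 GSol/s network, example 1.25 ZEC reward:
est = expected_zec_per_day(140, 5_000_000_000, 1.25, 1152)
print(f"{est:.2e} ZEC/day")
```

Plug in the live network hashrate and current miner reward from a pool's stats page to get a realistic figure; the function itself is just the proportional-share arithmetic.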
Sorry about the essay, and thanks for your time
submitted by starseed-pl to zec [link] [comments]

How to increase the hash rate on 1660ti.

#startmining


The MSI 1660 Ti is among the best pocket-friendly GPUs on the market for mining cryptocurrency. It has an out-of-the-box hashrate of about 22.7 MH/s, but I ended up getting above 30.5 MH/s by overclocking it with MSI Afterburner, with both the mining rig and my system running smoothly.
I mine Ethereum with 2 MSI 1660 Ti graphics cards and get over 61.8 MH/s, which is absolutely brilliant.
The overclocking settings I used are as follows:
Core Voltage: 0%
Power limit: 70%
Temp limit: 72%
Core clock: -170
Memory clock: +1050
Fan speed: 60
My GPU 1 current temp: 49 °C
My GPU 2 current temp: 47 °C
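A quick arithmetic check that the numbers claimed above are self-consistent; the figures come straight from the post:

```python
def oc_gain_pct(stock_mhs: float, oc_mhs: float) -> float:
    """Percentage hashrate gained from the overclock."""
    return (oc_mhs - stock_mhs) / stock_mhs * 100

gain = oc_gain_pct(22.7, 30.5)  # ~34% from the Afterburner tweaks
rig_total = 2 * 30.5            # two cards: 61 MH/s, close to the 61.8 reported
print(round(gain, 1), rig_total)
```

So the quoted two-card total lines up with the per-card figure, with the small surplus presumably down to card-to-card variance.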
You can also download a free crypto browser and start earning free bitcoin today here: https://cryptotabbrowser.com/14685350
submitted by Sensitive-Court2050 to u/Sensitive-Court2050 [link] [comments]

WARNING: Andy Android emulator (AndyOS, Andyroid) drops a bitcoin miner on your system

lawrenceabrams has done a lot more digging and research and has published an article which you can read here.
Update: Their Facebook support group has been changed to a closed group meaning you can't view their posts if you're not already a member. Luckily I have a fair few sleeper accounts in that group and I'll report back with anything worth noting.
Clarification: In the video I fail to close Andy when checking my GPU stats but I can confirm that they are roughly the same as when Andy is open. The mining process runs even with Andy closed and it opens on startup. I use the term bitcoin in this thread and the video as it's almost become a generic trademark. People instantly know what bitcoin is. I used cryptocurrency when talking to people in the Andy support group and they got confused and thought I was talking out my arse.
MAJOR UPDATE: I asked the Andy staff why they're still serving the infected file. After seeing that comment, and probably after seeing this reddit thread they've removed me from the group.
A friend opened Andy in process explorer to see the files it drops upon installation. By the looks of things, the installer isn't at fault. Andy itself calls an IP which then transfers the bitcoin miner to your system.
Andy clearly have no interest in fixing this issue and they're doing their best to censor it. At this point I wouldn't be surprised if this is completely down to their doing. The fact that they've completely blocked me from contacting them and the removal of all of my posts to them suggests that they don't care and don't want anyone to know.
Please keep in mind that this may not directly be Andy's fault. I'm not trying to directly accuse Andy of being at fault here, but until an official statement is made by the Andy team I'm going to tell it how it is, and how the majority of people will see this situation. The installer Andy uses drops a cryptocurrency miner on your system; it has been reported in the past, but no effort has been made to cut ties with the company that created the installer. This is still Andy's responsibility. Funnily enough, the owners of Andy and the admins in the Andy support Facebook group actually recommend turning off your antivirus whilst installing.
All evidence provided on this post is true with version 'Andy_Nougat_260_1096_26' (latest release available from the official Andy website).

Backstory

I was searching for an Android Emulator and came across an Android Authority list of the 15 best Android emulators for PC (now 14 after I contacted the writer of the article with evidence). I saw Andy was on this list and it was described as a big competitor to the likes of Bluestacks. I'd used Bluestacks previously but I was looking for a different emulator just to try something new. I downloaded Andy, installed it (I declined the offer relating to Yahoo), and began using it. I finished up what I was doing, closed Andy and opened some games. I noticed that in every single game I played I suffered major FPS drops at seemingly random times. I checked my GPU usage and temps and noticed they were working at roughly 80% load and 80+ degrees C whilst gaming. Very unusual for my setup. I opened task manager and sorted it via what was using the most GPU power and found a process named 'updater.exe'. After further inspection I noticed that this installed along with Andy.

Evidence

I created a video showcasing the entire installation process, including GPU usage before and after Andy was installed. This was sent directly to the creators of Andy (which is who I'm referencing in the video), as they refused to believe that the bitcoin miner was anything to do with installing their software. Apparently giving them virustotal scans and screenshots are not enough evidence and some users in the Andy support Facebook group blindly tried accusing me and my friends of using a tampered installer. The video shows that I downloaded every single executable possible from their official website and I was served the same installer each time.

How to remove Andy

Removing Andy and the bitcoin miner is actually really easy. The miner doesn't even attempt to hide itself and doesn't have a specific payload so it's just always running.
  1. Close every Andy-related process via task manager.
  2. Uninstall Andy via Windows
  3. Look for a process named 'Updater' (This is the miner and surprisingly enough won't be uninstalled when you uninstall Andy! Would you believe it!)
  4. Right click that process and click 'Go to details'
  5. Right click 'Updater.exe' in details and click 'End process tree'
  6. Navigate to C:\Program Files (x86)
  7. Click once on the folder named 'Updater' and then press Shift+Delete
  8. Click once on the folder named 'AndyOS' and then press Shift+Delete
  9. Recheck task manager to confirm no more Andy services are running
  10. Download Malwarebytes and perform a full system scan to check if anything was missed
  11. Download CCleaner and do a registry fix. Multiple Andy registry entries will be found. Delete these and scan again to ensure that nothing was missed
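Step 5's "End process tree" is the important one: killing only Updater.exe can leave its child processes running. The selection logic amounts to a walk over parent/child links. Below is an illustrative, self-contained sketch over (pid, ppid, name) records; the snapshot data is hypothetical, and on a real system you would take it from Task Manager or a tool like Process Explorer:

```python
def process_tree_pids(procs, root_name):
    """Return the pid of every process named root_name plus all of its
    descendants: the set 'End process tree' would terminate."""
    children = {}
    for pid, ppid, _ in procs:
        children.setdefault(ppid, []).append(pid)
    roots = [pid for pid, _, name in procs if name.lower() == root_name.lower()]
    doomed, stack = set(), list(roots)
    while stack:
        pid = stack.pop()
        if pid not in doomed:
            doomed.add(pid)
            stack.extend(children.get(pid, []))  # walk down to grandchildren etc.
    return doomed

# Hypothetical snapshot: the miner spawned a console host and a worker.
snapshot = [(4, 1, "explorer.exe"), (77, 4, "Updater.exe"),
            (90, 77, "conhost.exe"), (91, 90, "worker.exe")]
print(sorted(process_tree_pids(snapshot, "updater.exe")))  # [77, 90, 91]
```

Note that explorer.exe (the miner's parent) is correctly left alone: the walk only descends from the matching process.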

Why didn't my antivirus detect it?

The likelihood is that your antivirus assumed you wanted it. If every antivirus detected bitcoin miners as a threat, it would only get in the way of people who genuinely want to mine bitcoin on their own systems for personal use.

What now?

The Andy development team claim they are 'looking into this', but it has been reported to them in the past and nothing has changed at all. Andy has been removed from Android Authority's list of best Android emulators after I contacted the writer of the article with this evidence. He also installed Andy and confirmed that something fishy is going on. Even after being provided with evidence, the infected installer is still served today from their website.

Andy devs giving conflicting stories

Someone working for Andy by the name of Ghazi has been urging people to stop spreading the claims that Andy installs a bitcoin miner by saying that Andy doesn't mine for bitcoins and that we've been using an older version, which uses a similar method as Andy requires something to do with blockchain technology. This makes no sense. I don't understand why a modified ROM and basic application that hooks into a virtual machine would require anything to do with blockchain technology. Another reason this makes no sense is that the OWNERS of Andy said that it shouldn't be there, and that it's not their fault because they use a third party installer provided by another company. Two very conflicting stories.

TL;DR

In summary, when you install Andy from their official website, you 100% receive a bitcoin miner.
I will update this post with any further advancements.
Edit: The thing Ghazi was talking about is a deprecated ‘Andy Cloud Experiment’ which is no longer in use. They are still looking into the current issue but are still serving the infected file.
Edit: After being banned from their support group I got in on another account. I made a post and when I told them who I was they instantly banned me again. Fantastic! Great guys! Professionals!
Edit: Joined on a third account and was banned again! What a surprise!

In the news:
Betanews: https://betanews.com/2018/06/18/andy-os-bitcoin-mine
submitted by TopWire to emulators [link] [comments]

RX 5700 (NON XT) Mining Performance (7/28/2019)

**Disclaimer**
[Still testing and tuning. The AMD RDNA architecture is new, and not only is AMD still optimizing drivers, but the mining developers, who do NOT get GPUs sent to them, are still working on optimizations. Please be patient with me as I continue to test and allow sufficient time for new miners to be developed.]
Same stuff, different day: just as with the RX 590 Fatboy and RTX 2080, I will be testing the RX 5700 over time as new miners come out, to compare price to performance for mining. Below are some of my results testing the new AMD RX 5700 (non-XT) graphics card's mining performance; for now I was only able to get a few algorithms working. I did some videos on its gaming performance and the "SoftPowerPlayTables" mod from Igor's Lab at Tom's Hardware, which allowed the RX 5700 to really stretch its legs, letting this non-XT model surpass the RTX 2060 Super and even get on par with the first-gen RTX 2070. Moving forward, as new miners are released I will update my numbers and retest when I can.
***UPDATE: 7/31/19 - New Phoenix Miner 4.5c still only getting 2-4 MH/s; XMR-Stak 2.10.7: the only algo that will run is RYO
***UPDATE: 9/15/19 - Updated Power Draw numbers, as my Watt Meter died, new one in and retested Algos below
***UPDATE: 12/14/19 - Updated and added Algos as miner support was implemented. Retesting with Radeon Adrenalin 2020 driver
***UPDATE: 1/22/20 - Updated additional miners as support was implemented. Retesting with Radeon Adrenalin 2020 driver (20.1.3)
RX 5700 GPU
Driver Currently in Use:
Mining Performance AMD DRIVER - Adrenalin Edition 19.9.1
OverdriveNTool 0.2.8
Average temps during mining:
Stock setup: 65 °C - 72 °C
Aggressive fan curve: 40% - 75%
Algo (Mining Program) / OC settings (volt mV) / Power draw

Claymore Miner (Updates will Follow) [ UPDATED 9/15/2019 got new Kill-A-Watt Meter ]
ETH (Claymore Miner V 15) STOCK*** 49.5 MH/s 1750 Core (1037 mV) / Mem 1750 (850 mV) 155 Watts
ETH (Claymore Miner V 15) SPPT Mod*** 51 MH/s 1900 Core (1037 mV) / Mem 1800 (850 mV) 155 Watts
ETH (Claymore Miner V 15) SPPT Mod*** 53.2 MH/s 1750 Core (990 mV) / Mem 1850 (850 mV) 160 Watts
ETH (Claymore Miner V 15) SPPT Mod*** 53.5 MH/s 1750 Core (990 mV) / Mem 1860 (850 mV) 160 Watts
ETH (Claymore Miner V 15) SPPT Mod*** [Best Config] 52.6 MH/s 1325 Core (900 mV) / Mem 1860 (850 mV) 115 Watts

Phoenix Miner (Updates will Follow) [ UPDATED 9/15/2019 got new Kill-A-Watt Meter ]

ETH (Phoenix Miner) STOCK*** 48.8 MH/s 1750 Core (1037 mV) / Mem 1750 (850 mV) 155 Watts
ETH (Phoenix Miner) [Best Config] 53.4 MH/s 1250 Core (750 mV) / Mem 1850 (850 mV) 115 Watts
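Comparing the stock and best-config Phoenix Miner rows above in MH/s-per-watt terms shows why the undervolt is worth the effort. The efficiency numbers below are derived from those two rows; the electricity rate is an assumed example, not a claim about anyone's actual bill:

```python
def mhs_per_watt(mhs: float, watts: float) -> float:
    """Mining efficiency: hashrate per watt of board power."""
    return mhs / watts

def power_cost_per_day(watts: float, usd_per_kwh: float = 0.12) -> float:
    """Daily wall-power cost; $0.12/kWh is an assumed rate."""
    return watts / 1000 * 24 * usd_per_kwh

stock = mhs_per_watt(48.8, 155)   # ~0.31 MH/s per watt
tuned = mhs_per_watt(53.4, 115)   # ~0.46 MH/s per watt
print(round(stock, 2), round(tuned, 2), round(power_cost_per_day(115), 2))
```

The tuned profile is roughly 45% more efficient than stock while also cutting the daily power cost, which is the whole point of the undervolting pass.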

ProgPow | BCI - Bitcoin Interest (ethminer not working on Navi ATM)
Phoenix Miner 4.9c Stock 1.44 MH/s 1750 Core (1018 mV) / Mem 1750 (850 mV) 140 Watts
Phoenix Miner 4.9c 1.291 MH/s 1500 Core (805 mV) / Mem 1850 (850 mV) 96 Watts
WildRig Multi Miner
Blake2b (WildRig Multi 0.20.1) 1.86 GH/s 1750 Core (1018 mV) / Mem 1750 (850 mV) 132 Watts
Blake2s (WildRig Multi 0.20.1) 4.7 GH/s 1750 Core (1018 mV) / Mem 1750 (850 mV) 128 Watts
BMW512 (WildRig Multi 0.20.1) 1.05 GH/s 1750 Core (1018 mV) / Mem 1750 (850 mV) 130 Watts
Lyra2REv3 (WildRig Multi 0.20.1) 61.4 MH/s 1750 Core (1018 mV) / Mem 1750 (850 mV) 160 Watts
Lyra2REv2 (WildRig Multi 0.19 Beta) 1.04 kH/s 1750 Core (1018 mV) / Mem 1750 (850 mV) 160 Watts
Lyra2vc0ban (WildRig Multi 0.19 Beta) 54.5 MH/s 1750 Core (1018 mV) / Mem 1750 (850 mV) 160 Watts
MTP (WildRig Multi 0.20.1) 2.5 MH/s 1750 Core (1018 mV) / Mem 1750 (850 mV) 152 Watts

XMR-Stak (Updates will Follow)
Cryptonight-GPU - RYO (XMR-Stak 2.10.8) 1.42 kH/s (1 Thread) 1750 Core (1018 mV) / Mem 1750 (850 mV) 132 Watts
Cryptonight-GPU - RYO (XMR-Stak 2.10.8) 1.62 kH/s (2 Threads) 1750 Core (1018 mV) / Mem 1750 (850 mV) 155 Watts
Cryptonight-GPU - RYO (XMR-Stak 2.10.8) 1.89 kH/s (2 Threads) 1900 Core (1018 mV) / Mem 1750 (850 mV) 160 Watts
Cryptonight-GPU - RYO (XMR-Stak 2.10.8) 1.89 kH/s (2 Threads) Undervolt 1900 Core (1000 mV) / Mem 1750 (850 mV) 155 Watts
Cryptonight-Conceal - CCX (XMR-Stak 2.10.8) 2.23 kH/s (1 Thread) 1750 Core (1018 mV) / Mem 1750 (850 mV) 116 Watts

XMR-Stak - Cryptonight-R
Cryptonight-R (XMR-Stak 2.10.7) 1.1 kH/s (2 Threads) Undervolt 1325 Core (800 mV) / Mem 940 (850 mV) 90 Watts
Cryptonight-R (XMR-Stak 2.10.7) 1.145 kH/s (2 Threads) 1750 Core (1018 mV) / Mem 875 (850 mV) 128 Watts

LOL miner - Grin 29
Grin29 (LoLMiner 0.8.8) 5.2 G/s 1750 Core (1037 mV) / Mem 1750 (850mV) 128 Watts
Grin31 (LoLMiner 0.9.3) 0.95 G/s 1750 Core (1037 mV) / Mem 1750 (850mV) 130 Watts

RX 5700 Mining Performance (WildRig - XMRStak)
submitted by cmvjax to gpumining [link] [comments]

I literally have tens of thousands of dollars in top-shelf hardware, looking to repurpose some before selling on eBay to build a NAS system, possibly a dedicated firewall device as well. o_O

Q1) What will you be doing with this PC? Be as specific as possible, and include specific games or programs you will be using.

A1) This will be a dedicated NAS system for my home network. As such, I'm looking to have it:

- Host ##TB's of 720-, 1080-and-up resolution movies and TV shows I'm about to begin ripping from a MASSIVE DVD & Blu-ray collection I have.

- My kids are big on Minecraft. I understand it's possible to host your own "worlds" (or whatever they call the maps you can build) on your own "server". I think it would be pretty neat to offer them (& their friends, if it can be done 'safely/securely') their own partition on one of my NAS HDDs.

- I also have accounts with a couple diff VPN companies... I understand it's possible (?) to run said VPNs on a NAS; this might be a more relevant topic for the next point/purpose...

- I'd like to be able to remotely link to this NAS for when I travel overseas and want to stream at my temp location from my house/this NAS.
______________________
Q2) What is your maximum budget before rebates/shipping/taxes?

* A2) Here's where I make matters more complicated than most others would... I've been an advocate for Bitcoin and crypto-currencies in general since 2013. I invested in a small mining outfit back in 2014 (strictly Bitcoin/ASIC's). One of my buddies is the President of a large-scale mining operation (foreign and domestic) and he convinced me to dabble in the GPU mining-space. I made my first hardware purchase in Q4, 2017 and launched a small-scale GPU-Farm in my house since then. I had the rigs mining up until Q3 of 2018 (not cost-efficient to keep on, especially living in SoFlo) and since then, the hardware's been collecting dust (& pissing off my family members since they lost access to 3X rooms in the house - I won't let anyone go near my gear). One of my New Years Resolutions for 2019 was to clear out the house of all my mining equipment so that's all about to go up on eBay. So "budget" is relative to whatever I "MUST" spend if I can't repurpose any of the parts I already have on hand for this build... (Anyone having something I "need" and is looking to barter for one of the items I'll list later on in here, LMK).
______________________
Q3) When do you plan on building/buying the PC? Note: beyond a week or two from today means any build you receive will be out of date when you want to buy.

A3) IMMEDIATELY! :)
______________________
Q4) What, exactly, do you need included in the budget? (Tower/OS/monitor/keyboard/mouse/etc)

A4) Well I had a half-assed idea approximately 1 year ago that it might be wise to build a bunch of 'gaming rigs' to sell on eBay with my intended repurposed mining hardware so I went on a shopping spree for like 6 months. That said; I've got a plethora of various other components that aren't even unboxed yet. 90% of the items I've purchased for this additional project were items that were marked down via MIR (mail-in-rebates) & what-not...
AFAIK, there are only 3X items I absolutely do not have which I 'MUST' find. Those would be: 1) a motherboard which accepts "ECC RAM"; 2) a CPU for said mobo; 3) said "ECC RAM".
______________________
Q5) Which country (and state/province) will you be purchasing the parts in? If you're in the US, do you have access to a Microcenter location?

A5) I'm located in Southwest Florida. No Microcenter's here. Best Buy is pretty much my only option although I am a member of Newegg, Amazon & Costco if that makes any difference?
______________________
Q6) If reusing any parts (including monitor(s)/keyboard/mouse/etc), what parts will you be reusing? Brands and models are appreciated.

A6) In an attempt to better clean up this Q&A, I'm going to list the items I have on-hand at the end of this questionnaire, in case passers-by feel like this might be a TL;DR. (Scroll to the bottom & you'll see what I mean.)
______________________
Q7) Will you be overclocking? If yes, are you interested in overclocking right away, or down the line? CPU and/or GPU?

A7) I don't think that's necessary for my intended purpose although - I'm not against it if that helps & FWIW, I'm pretty skilled @ this task already (it's not rocket science).
______________________
Q8) Are there any specific features or items you want/need in the build? (ex: SSD, large amount of storage or a RAID setup, CUDA or OpenCL support, etc)

A8) As stated in A4; ECC RAM is non-negotiable... RAID seems like a logical application here as well.

- This will predominantly be receiving commands from macOS computers. I don't think that matters really, but figured it couldn't hurt to let you guys know.

- I'd also be quite fond of implementing pfSense (or something of that caliber) on this system so I could give my Netgear Nighthawks less stress in that arena; plus, my limited understanding of pfSense is that its firewall capability runs circles around anything that comes with consumer-grade Wi-Fi routers (like my Nighthawks). Just the same, I'm open to building a second rig just for the firewall.

- Another desirable feature would be that it draws as little electricity from the wall as possible. (I'm EXTREMELY skilled in this arena. I have "Kill-A-Watts" to test/gauge with, as well as an intimate understanding of the differences between Silver-, Gold-, Platinum- and Titanium-rated PSUs, having already measured each of the PSUs I have on-hand and taken note of the 'target TDP draw' ("peak power-efficiency draw") each one offers when loaded with X GPUs in their original role.)

- Last, but not least, sound (as in noise created from the rig). I'd like to prop this device up on my entertainment center in the living room. I've (almost) all of the top-shelf consumer grade products one could dream of regarding fans and other thermal-related artifacts.

- Almost forgot; this will be hosting to devices on the KODI platform (unless you guys have better alternative suggestions?)
______________________
Q9) Do you have any specific case preferences (size like ITX/microATX/mid-tower/full-tower, styles, colors, window or not, LED lighting, etc), or a particular color theme preference for the components?

A9) Definitely! Desired theme would be WHITE. If that doesn't work for whatever reason, black or gray would suffice. Regarding case size: nah, that's not too important, although I don't foresee a mini-ITX build making sense if I'm going to be cramming a double-digit number of TBs into the system; internal HDDs sound better than a bunch of externals plugged into all the USB ports.
______________________
Q10) Do you need a copy of Windows included in the budget? If you do need one included, do you have a preference?

A10) I don't know. If I do need a copy of Windows, I don't have one so that's something I'll have to consider I guess. I doubt that's a necessity though.
______________________
______________________
______________________
Extra info or particulars:

AND NOW TO THE FUN-STUFF... Here's a list of everything (PARTS PARTS PARTS) I have on-hand and ready to deploy into the wild &/or negotiate a trade/barter with:

CASES -
Corsair Carbide Series Air 540 Arctic White (Model# CC-9011048-WW) - (Probably my top pick for this build).
Cooler Master HAF XB EVO (This is probably my top 1st or 2nd pick for this build, the thing is a monster!).
Cooler Master Elite 130 - Mini ITX - Black
Cooler Master MasterBox 5 MID-Tower - Black & White
Raidmax Sigma-TWS - ATX - White
MasterBox Lite 5 - ATX - Black w/ diff. Colored accent attachments (included with purchase)
NZXT S340 Elite Matte White Steel/Tempered Glass Edition
EVGA DG-76 Alpine White - Mid Tower w/ window
EVGA DG-73 Black - Mid Tower w/ window (I have like 3 of these)

______________________
CPU's -
***7TH GEN OR BELOW INTEL CPUs (codename class mentioned next to each one)***
Pentium G4400 (Skylake @54W TDP) - Intel ARK states is "ECC CAPABLE"
Celeron G3930 (Kaby Lake @ 51W TDP) - Intel ARK states is "ECC CAPABLE" :)
i5 6402P (Skylake @65W TDP) - Intel ARK states is "NOT ECC CAPABLE" :(
i5 6600k (Skylake @ 91W TDP) - Intel ARK states is "NOT ECC CAPABLE" :(
i7 6700 (Skylake @ 65W TDP) - Intel ARK states is "NOT ECC CAPABLE" :(
i7 7700k (Kaby Lake @ 95W TDP) - Intel ARK states is "NOT ECC CAPABLE" :(


***8TH GEN INTEL CPUs***
i3-8350K (Coffee Lake @91W TDP) - Intel ARK states is "ECC FRIENDLY" :)
I5-8600K (Coffee Lake @95W TDP) - Intel ARK states is "NOT ECC CAPABLE" :(


***AMD RYZEN CPUs***
Ryzen 3 2200G
Ryzen 5 1600
Ryzen 7 1700X

______________________
MOTHERBOARDS -

***7TH GEN AND BELOW INTEL-BASED MOBOS***
MSI Z170A-SLI
ASUS PRIME Z270-A
ASUS PRIME Z270-P
ASUS PRIME Z270-K
EVGA Z270 Stinger
GIGABYTE GA-Z270XP-SLI
MSI B150M ARCTIC
MSI B250M MICRO ATX (PRO OPT. BOOST EDITION)

***8TH GEN INTEL-BASED MOBOS***
EVGA Z370 FTW
GIGABYTE Z370XP SLI (Rev. 1.0)
MSI Z370 SLI PLUS


***AMD RYZEN-BASED MOBOS***
ASUS ROG STRIX B350-F GAMING
MSI B350 TOMAHAWK
MSI X370 GAMING PRO
ASROCK AB350M PRO4
______________________


RAM -

Way too many to list: nothing but 4 & 8GB DDR4 sticks, and unfortunately none are ECC, so it's not even worth listing them unless someone reading this is willing to barter, at which time I'd be obliged to send an itemized list or see if I have what they're/you're specifically looking for.
______________________
THERMAL APPLICATIONS/FANS -
JUST FANS -
BeQuiet -
Pure Wings 2 (80mm)
Pure Wings 2 (120mm)
Pure Wings 2 (140mm)
Silent Wings 3 PWM (120mm)

NOCTUA -
PoopBrown - NF-A20 PWM (200mm) Specifically for the BIG "CoolerMaster HAF XB EVO" Case
GREY - NF-P12 Redux - 1700RPM (120mm) PWM
Corsair -
Air Series AF120LED (120mm)

CPU COOLING SYSTEMS -
NOCTUA -
NT-HH 1.4ml Thermal Compound
NH-D15 6 Heatpipe system (this thing is the tits)

EVGA (Extremely crappy coding in the software here; I'm like 99.99% sure these will be problematic if I were to try and use them in any OS outside of Windows, because they barely ever work in the intended Windows as it is).
CLC 240 (240mm water-cooled system)
CRYORIG -
Cryorig C7 Cu (Low-Profile Copper Edition*)

A few other oversized CPU cooling systems I forget off the top of my head but a CPU cooler is a CPU cooler after comparing to the previous 3 models I mentioned.
I almost exclusively am using these amazing "Innovation Cooling Graphite Thermal Pads" as an alternative to thermal paste for my CPU's. They're not cheap but they literally last forever.

NZXT - Sentry Mesh Fan Controller
______________________
POWER SUPPLIES (PSU's) -
BeQuiet 550W Straight Power 11 (GOLD)

EVGA -
750P2 (750W, Platinum)
850P2 (850W, Platinum)
750T2 (750W, TITANIUM - yeah baby, yeah)

ROSEWILL -
Quark 750W Platinum
Quark 650W Platinum

SEASONIC -
Focus 750W Platinum
______________________
STORAGE -
HGST Ultrastar 3TB - 64mb Cache - 7200RPM Sata III (3.5)
4X Samsung 860 EVO 500GB SSD's
2X Team Group L5 LITE 3D 2.5" SSD's 480GB
2X WD 10TB Essential EXT (I'm cool with shucking)
+ 6X various other external HDD's (from 4-8TB) - (Seagate, WD & G-Drives)
______________________

Other accessories worth mentioning -
PCI-E to 4X USB hub-adapter (I have a dozen or so of these - might not be sufficient enough &/or needed but again, 'worth mentioning' in case I somehow ever run out of SATA & USB ports and have extra external USB HDD's. Although, I'm sure there would be better suited components if I get to that point that probably won't cost all that much).
______________________
______________________
______________________
Needless to say, I have at least 1X of everything mentioned above. In most all cases, I have multiples of these items but obviously won't be needing 2X CPU's, Cases, etc...

Naturally, I have GPU's. Specifically;

At least 1X of every. Single. NVIDIA GTX 1070 TI. (Yes, I have every variation of the 1070 ti made by MSI, EVGA and Zotac. The only brand I don't have is the Gigabyte line; my partners have terrible experience with those so I didn't even bother.) I'm clearly not going to be needing a GPU for this build but again, I'm cool with discussing the idea of a barter if anyone reading this is in the market for one.

I also have some GTX 1080 TI's but those are already spoken for, sorry.

It's my understanding that select CPU's I have on this list are ECC Friendly and AFAIK, only 1 of my MOBO's claims to be ECC Friendly (The ASROCK AB350M PRO4), but for the life of me, I can't find any corresponding forums that confirm this and/or direct me to a listing where I can buy compatible RAM. Just the same, if I go w/ the ASROCK MOBO, that means I'd be using one of the Ryzens. Those are DEF. power hungry little buggers. Not a deal-breaker, just hoping to find something a little more conservative in terms of TDP.


In closing, I don't really need someone to hold my hand with the build part as much as figuring out which motherboard, CPU and RAM to get. Then I'm DEFINITELY going to need some guidance on what OS is best for my desired purpose. If building 2X Rigs makes sense, I'm totally open to that as well...
Rig 1 = EPIC NAS SYSTEM
Rig 2 = EPIC PFSENSE (or the like) DEDICATED FIREWALL

Oh, I almost forgot... The current routers I'm using are...
1X Netgear Nighthawk 6900P (Modem + Router)
1X Netgear Nighthawk X6S (AC 4000 I believe - Router dedicated towards my personal devices - no IoT &/or Guests allowed on this one)
1X TP-Link Archer C5 (Router). Total overkill after implementing the Nighthawks but this old beast somehow has the best range, plus it has 2X USB ports so for now, it's dedicated towards my IoT devices.
---- I also have a few other Wi-Fi routers (Apple Airport Extreme & some inferior Netgear's but I can only allocate so many WiFi Routers to so many WiFi channels w/out pissing off my neighbors) On that note, I have managed to convince my neighbors to let me in their house/WiFi configuration so we all have our hardware locked on specific, non-competing frequencies/channels so everyone's happy. :)


Please spare me the insults as I insulted myself throughout this entire venture. Part of why I did this was because when I was a kid, I used to fantasize about building a 'DREAM PC' but could never afford such. To compensate for this deficiency, I would actually print out the latest and greatest hardware components on a word document, print the lists up & tape to wall (for motivation). I was C++ certified at the age of 14 and built my first PC when I was 7. At the age of 15 I abandoned all hope in the sector and moved on to other aspirations. This entire ordeal was largely based off me finally fulfilling a childhood fantasy. On that note = mission accomplished. Now if I'm actually able to fulfill my desires on this post, I'm definitely going to feel less shitty about blowing so much money on all this stuff over the last couple years.

TIA for assisting in any way possible. Gotta love the internets!


THE END.
:)

EDIT/UPDATE (5 hours after OP) - My inbox is being inundated with various people asking for prices and other reasonable questions about my hardware being up for sale. Not to be redundant but rather to expound on my previous remarks about 'being interested in a barter/trade' with any of you here...

I did say I was going to sell my gear on eBay in the near future, I also said I wanted to trade/barter for anything relative to helping me accomplish my OP's mission(s). I'm not desperate for the $$$ but I'm also not one of those people that likes to rip other people off. That said; I value my time and money invested in this hardware and I'm only willing to unload it all once I've established I have ZERO need for any of it here in my home first. Hence my writing this lengthy thread in an attempt to repurpose at least a grand or two I've already spent.

One of the most commonly asked questions I anticipate receiving from interested parties is going to be "How hard were you on your hardware?" Contrary to what most people in my scenario would probably do (claim they were light on it whether they were or weren't), I documented my handling of the hardware, and have no problem sharing such documentation with verified, interested buyers (WHEN THE TIME COMES) to offer you guys peace of mind.

I have photos and videos of the venture from A-Z. I am also obliged to provide (redacted) electricity bill statements where you can correlate my photos (power draw on each rig), and also accurately deduct the excess power my house consumed with our other household appliances. Even taking into consideration how much (more) I spent in electricity from keeping my house at a constant, cool 70-72F year-round (via my Nest thermostat). Even without the rigs, I keep my AC @ 70 when I'm home, and for the last 1.5-2 years I just so happened to spend 85% of my time here at my house. When I would travel, I'd keep it at 72 for my wife & kids.
Additionally, I had each GPU 'custom' over/underclocked (MSI Afterburner for all GPUs but the EVGA's).
I doubt everyone reading this is aware so this is for those that don't.... EVGA had the brilliant idea of implementing what they call "ICX technology" in their latest NVIDIA GTX GPU's. The short(est) explanation of this "feature" goes as follows:

EVGA GPU's w/ "ICX 9 & above" have EXTRA HEAT/THERMAL SENSORS. Unlike every other GTX 1070 ti on the market, the ones with this feature have each of their two on-board fans connected to an individual thermal sensor. Which means: if you were to use the MSI Afterburner program on one of these EVGA's and create a custom fan curve for it, you'd only be able to get one of the two fans to function as intended. The other fan simply would not engage, as the MSI Afterburner software wasn't designed/coded to recognize/communicate with the added sensor (let alone sensorS). This, in turn, would likely result in whoever's using it the unintended way having a GPU defect on them within the first few months, I'd imagine... Perhaps if they had the TDP power settings dumbed down as much as I did (60-63%), they might get a year or two out of it since it wouldn't run nearly as hot, but I doubt any longer than that, since cutting off 50% of the cooling system on one of these can't be ignored too long; surely capacitors would start to blow and who knows what else...
(Warning = RANT) Another interesting side-note about the EVGA's and their "Precision-X" over/underclocking software is that it's designed to only recognize 4X GPU's on a single system. For miners, that's just not cool. My favorite builds had 8X, and for the motherboards that weren't capable of maintaining stable sessions on 8, I set up with 6X. Only my EVGA rigs had 3 or 4X GPU's dedicated to a single motherboard. Furthermore, and as stated in an earlier paragraph (& this is just my opinion) = EVGA SOFTWARE SUCKS! Precision X wasn't friendly with every motherboard/CPU I threw at it, and their extension software for the CLC Closed-Loop-Cooling CPU water-coolers simply didn't work on anything, even when integrating with their own Precision-X software. The amount of time it took me to finally find compatible matches with that stuff was beyond maddening. (END RANT).
Which leads me to my other comments on the matter. That's what I had every single 1070 ti set at for TDP = 60-63%. Dropping the power load that much allowed me to bring down (on average) each 1070 ti to a constant 110-115W (mind you, this is only possible w/ "Titanium" rated PSU's, Platinum comes pretty damn close to the Titanium though) while mining Ethereum and was still able to maintain a bottom of 30 MH/s and a ceiling of 32 MH/s. Increasing the TDP to 80, 90, 100% or more only increased my hashrates (yields) negligibly, like 35-36 MH/s TOPS, which also meant each one was not only pulling 160-180W+ (Vs. the aforementioned 115'ish range), it also meant my rigs were creating a significantly greater amount of heat! Fortunately for the GPU's and my own personal habits, I live in South Florida where it's hot as balls typically, last winter was nothing like this one. Increasing my yields by 10-15% didn't justify increasing the heat production in my house by >30%, nor the added electricity costs from subjecting my AC handlers to that much of an extra work-load. For anyone reading this that doesn't know/understand what I'm talking about - after spending no less than 2-3 hours with each. and. every. one. I didn't play with the settings on just one and universally apply the settings to the rest. I found the 'prime' settings and documented them with a label-maker and notepad. Here's the math in a more transparent manner:

*** I NEVER LET MY GPU's BREACH 61C, EVER. Only my 8X GPU rigs saw 60-61 & it was the ones I had in the center of the build (naturally). I have REALLY high power fans (used on BTC ASIC MINERS) that were sucking air from those GPU's which was the only way I was able to obtain such stellar results while mining with them. **\*
Mining at "acceptable" heat temps (not acceptable to me, but most of the internet would disagree = 70C) and overclocking accordingly brings in X amount of yield per unit.
'Tweaking' (underclocking) the GPU's to my parameters reduced my yield per unit by 10-15%, but it SAVED me well over 30-35% in direct electricity consumption, and an unknown amount of passive electricity consumption via creating approximately 20%+ less heat for my AC handler to combat.

I say all this extra stuff not just for anyone interested in mining with their GPU's, but really to answer (in-depth) the apparent questions you people are asking me in PM's. Something else that should help justify my claims of being so conservative is the fact I only have/used "Platinum and Titanium" rated PSU's. Heat production, power efficiency and longevity of the hardware were ALWAYS my top priority. I truly thought Crypto would continue to gain and/or recover and bounce back faster than it did. If this project had maintained positive income for 12 months+, I'd have expanded one of our sites to also cater to GPU mining on a gnarly scale.
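For anyone wanting to sanity-check the underclocking trade-off described above, here's a rough sketch using the post's own ballpark figures (~115W at ~31 MH/s undervolted; 170W at 35.5 MH/s is my assumed midpoint of the quoted "160-180W+" near-stock range):

```python
# Efficiency comparison for the underclocking approach described above.
# Figures are the post's ballparks; 170 W is an assumed stock-ish draw.

def efficiency_mh_per_watt(hashrate_mh: float, watts: float) -> float:
    """Hashrate delivered per watt of draw."""
    return hashrate_mh / watts

undervolted = efficiency_mh_per_watt(31.0, 115.0)
stock = efficiency_mh_per_watt(35.5, 170.0)

print(f"undervolted: {undervolted:.3f} MH/s per W")       # ~0.270
print(f"stock:       {stock:.3f} MH/s per W")             # ~0.209
print(f"efficiency gain: {undervolted / stock - 1:.0%}")  # ~29%
```

In other words, giving up roughly 13% of the hashrate for roughly 32% less power draw is the trade being described.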

Once I have my NAS (& possibly 2nd rig for the firewall) successfully built, I'll be willing/able to entertain selling you guys some/all of the remaining hardware prior to launching on eBay. If there's something you're specifically looking for that I listed having, feel free to PM me with that/those specific item(s). Don't count on an immediate response but what you can count on is me honoring my word in offering whoever asks first right of refusal when the time comes for me to sell this stuff. Fortunately for me, PM's are time-stamped so that's how I'll gauge everyone's place in line. I hope this extra edit answers most of the questions you guys wanted to have answered and if not, sorry I guess. I'll do my best to bring light to anything I've missed out on after I realize whatever that error was/is. The only way anyone is getting first dibs on my hardware otherwise is if they either offer compelling insight into my original questions, or have something I need to trade w/.

THE END (Round#2)


submitted by Im-Ne-wHere to buildapcforme [link] [comments]

Understanding Crypto Mining | And perhaps a way to mitigate its impact on the PC gaming ecosystem

EDIT: Per the moderation staff, I'm adding in to the header what I'm using to make it easier for prospective miners.
  1. Go to https://www.nicehash.com/
  2. Create a login
  3. Download their software and run it (this used to be "????")
  4. Profit
Once you reach 0.002 BTC (about 7-10 days on my GTX 1060 + i7-7700k), you can transfer your earnings to Coinbase for free, and cash out. CB does have fees for conversion to Fiat (cash) and your percentage goes down with higher amounts. So don't cash out just because you can. Cash out when you have enough to buy something.
Also a note on taxes. I'm going to keep this simple.
Hi folks. I just want to thank those of you in advance who trudge through this post. It's going to be long. I will try to have a TLDR at the end, so just scroll down for the bolded text if you want Cliff's Notes.
Disclaimer: I'm a miner, sort of. I casually mine when I sleep/work, using my existing PC. It doesn't make much. I don't buy hardware for mining. But, I still wanted to post this disclaimer in the interest of fairness.
As we all know, cryptocurrency mining has had a devastating impact on the PC gaming ecosystem. The demand for GPUs for mining has lead to scarce availability and sky high prices for relevant hardware. But even hardware that is less desirable for mining relative to their peers (GTX 1050ti, 1080) has been impacted. Why? Because when gamers can't get the 1060 or 1070 that they desire, they gravitate en masse towards something that their finances will allow them to settle for.
But for all that we know about mining, there's still a LOT of myth and misinformation out there. And I blame this on the bigger miners themselves. They have a few tactics they're using to discourage competition. Now, why would they do this? Simply put, the more coins are mined, the harder the algorithms get. That means the same hardware mines a lower rate of cryptocurrency over time. If the mining rates were to get too low before new hardware (Volta/Navi) could be released, it would cause a massive depression in the cryptocurrency market. Most hardware would become unprofitable, and used GPUs would flood the market. Miners want to retain profitability on current hardware until the next generation hardware is out.
So, what tactics are they engaging in? Silence and manipulation. On the former, the bigger miners don't usually participate and contribute to the community (there are exceptions, and they are greatly appreciated). They're sponges, taking whatever the community provides without returning much to the community. On the latter, they post here, in this very sub occasionally. And they continue to push certain types of myth/misinformation to discourage other users from mining.
And why, of all people, would you discourage gamers from mining? It's because of the competition point mentioned above. If a massive number of gamers entered the cryptocurrency mining market, it could trigger a mining apocalypse. There's an estimated 3-4 million current-gen GPUs being used in 24/7 mining operations by dedicated miners. Now, how many current-gen GPUs are used by gamers? I'd bet at least an equal amount. But what about Maxwell and Kepler? Or all those GCN-based GPUs up through Fiji? Bottom line is that when you factor in all available profitable GPUs, gamers drastically outnumber dedicated miners (yes, Kepler and GCN 1.0 are still profitable, barely). And if a large number of those users started casually mining as I am, the following would occur:
  • difficulty would increase, lower output (profitability) for everyone involved
  • Coin creation would initially accelerate, and with no massive change to the market cap, that means per-coin value drops
  • when you factor in slower coin generation for individual miners, coupled with lower coin value, you get...
  • ROI length increase on GPUs, depressing their values, which would lead to lower prices and higher availability
Oh dear, someone just spilled the beans...
So naturally, misinformation needs to be spread. If dedicated miners can keep the uninformed, well, uninformed, they're less likely to join in. And I've seen variations of the following misinformation spread. Here's the common tropes, and my rebuttal.
Mining on your GPU will cause it to die prematurely.
I really wish we had a Blackblaze-equivalent for GPUs used in data centers. NOTHING punishes a GPU like full-time use in a data center. Not mining, not gaming, and not prosumer usage. And these companies pay thousands per GPU. Clearly, they're getting solid ROI for their use.
But let's talk about mining specifically. For my GTX 1060, I limit power to 80% (96W). Fan speed is at a constant 40% (that's in the same ballpark as your blower-style GPU in desktop usage). Temperature is a constant 75°C. That's gentle. Gaming hurts it more (start/stop on the fan, varying temps, quick rise at the start and fall at the end, varying loads, etc.).
And if GPUs did prematurely die from mining? One miner insisted that I'd never see an ROI on my 1060 (which cost me $240) because it would die before I could earn that amount. Yea, GPUs routinely die before hitting their ROI. That's why miners are buying $200 GPUs today for $500, or $400 GPUs today for $900. Because they don't generate enough to cover their MSRP, let alone their current gouged prices. /s
Common sense would dictate that miners are profitable, or they wouldn't mine. Therefore, GPUs are not dying prematurely. So, don't fall for this one. And yes, I've seen those photos of the 20-card Sapphire RMA. Mining data centers have THOUSANDS of cards. Just do an image search for a GPU mining farm. This is well within typical acceptable defect rates.
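To put the RMA-photo argument in numbers: the post only says mining farms have "thousands of cards," so the farm size below is a hypothetical round figure, but it shows why 20 returns isn't alarming:

```python
# Quick sanity check: 20 RMA'd cards out of a hypothetical 2,000-card
# deployment is an ordinary failure rate, not evidence mining kills GPUs.

def defect_rate_percent(returned: int, deployed: int) -> float:
    return returned / deployed * 100.0

print(f"{defect_rate_percent(20, 2000):.1f}%")  # 1.0%
```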
Power costs are too high for mining to be profitable.
Warning! Danger Will Robinson! Math ahead!
Where I live, electricity ranges from 9.5 cents per kilowatt-hour (kWh) to 10.1 cents per kWh. Let's round to 10 cents. Power measured at the wall from my surge protector, while mining, shows just under 200W. (That includes my tower, monitor, speakers, a dedicated NAS, a router, and PSU inefficiency.) That also includes mining on both CPU and GPU.
At a constant 200W draw, it takes 5 hours to consume 1 kWh, which works out to roughly 5 kWh per day. That is $0.50 per day total from that outlet (and most of this stuff would be running anyway). That's not even "over my existing costs," that's just out the door.
Bottom line is that electricity is cheap in many areas. The USA national average is currently ~12 cents per kWh (RIP Hawaii, at 33 cents). For most of the developed world, power costs are not prohibitive. Don't fall for this. If unsure, check your rates on your bill, and ask someone who can do math if you can't.
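The arithmetic above generalizes easily; here's a minimal sketch (the 200W draw and 10¢/kWh rate are this post's numbers, so plug in your own bill's rate):

```python
# Daily/monthly electricity cost for a constant load.
# watts: draw at the wall; cents_per_kwh: your utility rate.

def mining_power_cost(watts: float, cents_per_kwh: float, hours: float = 24.0) -> float:
    """Dollar cost of running a constant load for `hours`."""
    kwh = watts / 1000.0 * hours
    return kwh * cents_per_kwh / 100.0

print(f"${mining_power_cost(200, 10):.2f}/day")             # $0.48/day
print(f"${mining_power_cost(200, 10, 24 * 30):.2f}/month")  # $14.40/month
```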
Casually mining isn't profitable
There's a big difference between "profit" and "getting rich." I have no expectations of the latter happening from what I'm doing. But "profit" is very much real. It's not power costs that derail profitability. It's all of the hidden fees. Many mining programs take a cut of your output. And then a cut to transfer to a wallet. And then there's a fee to transfer to an exchange. Oh, did you want to then convert to cash? We can...for a fee!
The trick is in finding outlets that allow you to minimize fees. I give up 2% of my output, transfer to my wallet for free, can transfer to an exchange for free, and don't plan to cash out every time I meet the minimum threshold (higher fees!). I instead plan to cash out at extended set intervals to minimize those fees.
NOTE: I am deliberately not listing the provider(s) that I use, because I don't want to be accused of being associated with them and/or driving business to them. I want this post to be about the big picture. But I will answer questions in the comments, provided the moderation staff here has no objections.
Bottom line is that with a mid-range GPU like mine, and without the benefit of CPU mining (it's just not worth it without a modern Core i7, or Ryzen 5/7), my GPU alone could make me ~$60-$75/mo in profit at current rates. Think of how many months/years you go between upgrades. Now, do the math. Needless to say, I'm now regretting not going bigger up front :)
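Under the post's own assumptions (a $240 card earning ~$60-75/mo in profit), the payback period is short; a quick sketch (these figures move with coin prices and difficulty, so treat them as illustrative):

```python
# Months for a GPU to pay for itself out of monthly mining profit.
# Uses the post's figures ($240 GTX 1060, $60-75/month profit).

def months_to_roi(hardware_cost: float, monthly_profit: float) -> float:
    if monthly_profit <= 0:
        return float("inf")  # never breaks even
    return hardware_cost / monthly_profit

print(f"{months_to_roi(240, 60):.1f} months")  # 4.0 months (low end)
print(f"{months_to_roi(240, 75):.1f} months")  # 3.2 months (high end)
```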
It's too complicated for a casual miner, so don't bother
The old "go big or go home" saying, and it sort of piggy backs off the last one. And there is some truth in this. If you're going to be a big-time miner, you need mining programs (often dedicated to each algorithm and/or currency), multiple wallets, access to multiple exchanges, etc. It's daunting.
But for the casual, you don't need that. There are multiple providers who offer you a one-stop-shop. I have one login right now. That login gives me my mining software, which switches between multiple algorithms/coins, gives me a wallet, and lets me transfer to an outside wallet/exchange. My second login will be the exchange (something that lets me convert my currency to local cash) when my balance justifies it. Given the recent Robin Hood announcement, I'm biding my time to see what happens. This space is getting competitive (lower fees).
Bottom line, it's easier now than it ever was before. As I told someone else, "Once I finally started, I wanted to kick my own ass for waiting so long."
New GPUs are expensive, but if you just wait, there will be a buttload of cheap, used GPUs for you!
Miners learned from the last crash. There were two types of miners in that crash: those who sold their GPUs at a loss, and those who kept mining and made out like bandits on the upswing. Turns out, cryptocurrency really does mimic the stock market (for now).
We're going to look at Bitcoin (BTC) to explain this. No, miners don't mine BTC. But, BTC is commonly what most coins are exchanged for (it makes up roughly one third of the entire cryptocurrency market). And it's the easiest currency to convert to cash. So, when BTC rises or falls in price, the rest of the market goes with it. That includes all of the coins that GPU miners are actually mining.
In January 2017, when the current mining push started, BTC was worth roughly $900 per coin. It's now worth roughly (as of this post) $12,000 per coin, down from a December high of over $20,000 per coin. So yea, the market "crashed." It's also more than 12x the value it was a year ago, when miners dove in. You think they're going to bail at 12x the value? Son, I've got news for you. This market needs to truly crash and burn for them to bail (and that's where you come in!).
So, there's not going to be a flood of used GPUs from a sudden market crash. Again, they've learned from that mistake. Used GPUs will enter the market when they are no longer profitable for mining, and not before. Dedicated miners have lots of room for expansion. When Volta comes out, they're not selling their Pascal GPUs. They're building new Volta mining rigs alongside the Pascal ones, making money off each of them.
Conclusion/TLDR:
  • Mining is subject to diminishing returns. It gets harder over time on the same hardware.
  • PC gamers joining the market en masse could trigger an apocalypse in terms of difficulty
  • Due to this, it benefits pro miners to spread misinformation to discourage gamers from entering the mining game
  • Casually mining on your existing system is safe, easy, could help you pay for your next upgrade(s), and could also hurt the mining market in general (better availability/pricing on GPUs)
  • No, there's no flood of used Pascal/Polaris/Vega GPUs around the corner, as those are HIGHLY profitable even in a depressed market
Second Conclusion - Why do I (jaykresge) personally care?
Simply put, I'm disgusted by this. I was excited about flipping a few friends from consoles to PC gaming. I'm now seeing a reverse trend. One friend is gaming on an RX 560 waiting for prices to hit sanity. He's running out of patience. Others have bailed.
I view our dormant GPUs as the best weapon against cryptocurrency mining. Destroy it from the inside. It's win-win for most of us. Either we earn enough for more upgrades, or we depress pricing. Something's got to give.
In other words, y'all f*ckers better start mining, because I want Volta to be reasonably priced when it launches so I can get an EVGA x80 Hybrid to go with a G-Sync monitor. And if this doesn't happen, I'm going to be cranky!
Seriously though, thanks for reading. Bear with me as I go over this a few more times for typing/grammar. And I look forward to your comments.
submitted by jaykresge to hardware [link] [comments]

Bitcoin Rhodium Mining Guide

Bitcoin Rhodium Mining Guide
Happy Mining!

All available XRC pools can be found on MiningPoolStats

Bitcoin Rhodium Mining Hardware

Baikal Giant+: 1.6 GH/s
Baikal Quad Cube: 1.2 GH/s
Baikal Giant: 900 MH/s
Baikal Quadruple Mini Miner: 600 MH/s
Baikal Miner Cube: 300 MH/s
Baikal Mini Miner: 150 MH/s

Mining Setup

To mine Bitcoin Rhodium you need to set up an XRC wallet and configure your miner of choice. You can choose between Web wallet, Electrum-XRC or Magnum wallet. To set up a web wallet please visit wallet.bitcoinrh.org. Or download and install Electrum-XRC wallet (recommended) for Windows, Linux and MacOS.
Web wallet: wallet.bitcoinrh.org
Electrum-XRC wallet: electrum.bitcoinrh.org
Magnum wallet: https://magnumwallet.co

Sign up for XRC web wallet if not yet done so

  1. Create an account, with your username, password and secure question.
  2. Sign in and click “Create Wallet”.
  3. Set up a strong transaction password. Make sure you store it securely in a secure password manager of choice.
  4. Copy the seed somewhere safe. It’d be a good idea to write seed on a hardcopy and keep it safe.
  5. Paste it to confirm you got it right.
  6. Grab an address for the mining step. Your wallet is now ready to mine XRC.

Instructions for mining XRC on the official pool

Pool link: poolcore.bitcoinrh.org
  1. Any miner that supports X13 will be able to mine XRC. We have a few examples below of miners that are well tested with Bitcoin Rhodium network.
  2. For any miner, configure the miner to point to:
(0–0.8 GH/s) stratum+tcp://poolcore.bitcoinrh.org:3061
(0.8–2 GH/s) stratum+tcp://poolcore.bitcoinrh.org:3062
(3–4 GH/s) stratum+tcp://poolcore.bitcoinrh.org:3063
(5+ GH/s) stratum+tcp://poolcore.bitcoinrh.org:3064
with your XRC address as username and x as password. You don't need to open an account on the pool. You will be mining to your XRC address, and mined coins will be transferred to your wallet:
after blocks reach 10-block maturity
after you've mined the minimum payout amount (currently 0.1 XRC)
Sometimes mined blocks can get rejected by the network (orphaned) after they were counted as valid blocks; this is normal behavior, as the network follows the longest chain.
  3. Use http://poolcore.bitcoinrh.org to follow your miner and network statistics.
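The hashrate-tiered ports above can be wrapped in a small helper. (A sketch: the published tiers leave gaps at 2-3 and 4-5 GH/s, which this assigns to the lower adjacent tier.)

```python
# Pick the official pool's stratum port from a rig's total hashrate,
# per the tier list above. Gaps in the published tiers (2-3, 4-5 GH/s)
# are assumed to fall into the lower adjacent tier.

def xrc_stratum_port(ghs: float) -> int:
    if ghs < 0.8:
        return 3061
    if ghs < 3.0:
        return 3062
    if ghs < 5.0:
        return 3063
    return 3064

# e.g. a Baikal Giant+ at 1.6 GH/s:
print(f"stratum+tcp://poolcore.bitcoinrh.org:{xrc_stratum_port(1.6)}")
```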

CPU Miner-Multi

Source: https://github.com/tpruvot/cpuminer-multi
Sample configuration with CPU Miner tested on UBUNTU.
{
  "url": "stratum+tcp://poolcore.bitcoinrh.org:3061",
  "user": "YOUR XRC ADDRESS",
  "pass": "x",
  "algo": "x13",
  "threads": 1,
  "cpu-priority": 5,
  "cpu-affinity": 1,
  "benchmark": false,
  "debug": true,
  "protocol": true,
  "show-diff": true,
  "quiet": false
}
Command to run your CPUMiner: cpuminer -c cpuminer.json

SGMiner (ATI GPU)

SGMiner is a GPU-based miner: https://github.com/nicehash/sgminer/releases
The configuration below was tested on Windows:
setx GPU_FORCE_64BIT_PTR 0
setx GPU_MAX_HEAP_SIZE 100
setx GPU_USE_SYNC_OBJECTS 1
setx GPU_MAX_ALLOC_PERCENT 100
setx GPU_SINGLE_ALLOC_PERCENT 100
cd C:\Software\sgminer-5.6.1-nicehash-51-windowsamd64
sgminer.exe --gpu-platform 1 --algorithm x13mod --url stratum+tcp://poolcore.bitcoinrh.org:3062 --pool-user YOUR_XRC_ADDRESS --userpass YOUR_XRC_ADDRESS:x --auto-fan --temp-target 70 --temp-overheat 82 --temp-cutoff 85 --gpu-fan 65-85 --log-file log.txt --no-adl --no-extranonce -P -T

CCMiner (NVIDIA GPU)

CCMiner is a GPU-based miner (NVIDIA)
Command to run your CCMINER:
ccminer-x64.exe -a x13 -o stratum+tcp://poolcore.bitcoinrh.org:3062 -O YOUR_XRC_ADDRESS:x -D --show-diff

Baikal miner

Settings: Url:
(0–2 GH/s) stratum+tcp://poolcore.bitcoinrh.org:3062
(3–4 GH/s) stratum+tcp://poolcore.bitcoinrh.org:3063
(5+ GH/s) stratum+tcp://poolcore.bitcoinrh.org:3064
Algo: x13
User: your XRC receiving address (make sure you set 2 distinct addresses, one for each hashing board)
Pass: x
Extranonce: leave off
Priority: set to 0 and 1
Once the pool stratum address and your wallet (as user) are set up, you should see your miner mining against the XRC pool. When the miner is working, the status column is green. If the pool and miner are incorrectly configured, the status will say “Dead”, highlighted in red.

Instructions for mining XRC on BSOD pool

Pool link: bsod.pw/en/pool/dashboard/XRC/
Use this code for your miner: -a x13 -o stratum+tcp://pool.bsod.pw:2582 -u WALLET.rig
BSOD pool allows both solo and party mining.
For solo mining use code: -a x13 -o stratum+tcp://pool.bsod.pw:2582 -u WALLET.rig -p m=solo And for party mining use: -a x13 -o stratum+tcp://pool.bsod.pw:2582 -u WALLET.rig -p m=party.yourpassword
NOTICE: You can use "us" for North America and "asia" for Asia instead of "eu" in your .bat file or config.
You can also use BSOD pool’s monitor app for Android and iOS.

Instructions for mining XRC on ZERGPOOL

Zergpool offers low fees (just 0.5%) and also SOLO and PARTY mining with no extra fees.
To mine XRC on Zergpool use this command lines for your miner:
Regular: -a x13 -o stratum+tcp://x13.mine.zergpool.com:3633 -u WALLET -p c=XRC,mc=XRC
Solo: -a x13 -o stratum+tcp://x13.mine.zergpool.com:3633 -u WALLET -p c=XRC,mc=XRC,m=solo
Party: -a x13 -o stratum+tcp://x13.mine.zergpool.com:3633 -u WALLET -p c=XRC,mc=XRC,m=party
Use your coin wallet address as username in mining software. Specify c=SYMBOL as password to identify payout wallet coin, and the same coin in mc=SYMBOL to specify mining coin.
For more information and support please visit http://zergpool.com
Note that when there are multiple pools mining XRC in different geographic/availability locations, choose the one nearest to you as your primary, and then add desirable fallback pool options in other geographic locations or pools. This is useful when one pool experiences issues, so you can fall back to a different pool on the Bitcoin Rhodium network.

Calculate your Bitcoin Rhodium mining profitability

WhatToMine: https://whattomine.com/coins/317-xrc-x13
CoinCalculators: https://www.coincalculators.io/coin/bitcoin-rhodium

Feel free to ask questions in Discord community. There are lots of helpful people around the world watching XRC 24x7.

Bitcoin Rhodium Dev Team
submitted by BitcoinRh to BitcoinRhodium [link] [comments]

COSMiC V4.1.3t Update [ nVidia-CUDA | Win64 | GUI-Based ]

Hello everyone! I'm happy to share what I've been working on with you today. This is an update to COSMiC V4 that makes significant improvements to nearly every area of the miner (detailed below.)
Important Note: Now built against CUDA v10.1+Update1, so it is strongly recommended that you update your nVidia graphics drivers. (It is NOT necessary to install the CUDA toolkit)
FEATURES:
CHANGES THIS VERSION:
Download Link:
https://bitbucket.org/LieutenantTofu/cosmic-v3/downloads/COSMiC-v4.1.3t-Win64.zip
Screenshots:
https://imgur.com/a/4LshQ1d
If you have any questions, comments or feedback feel free to leave them here or contact me on the 0xBitcoin Discord (Username: LtTofu#6168)!
submitted by LieutenantTofu to 0xbitcoin [link] [comments]

COSMiC v4.1.3 - the GUI-based, High-Performance Ethereum Token Miner [ ERC-918/0xBTC | nVidia/CUDA | Win64 ]

Hello everyone! I'm happy to share what I've been working on with you today. This is an update to the COSMiC V4 token miner (for nVidia/CUDA devices and Windows 7 or newer, 64-bit). This version makes substantial improvements to nearly every area of the miner (detailed below.)
Requires: 1 or more nVidia (CUDA) GPU, any 64-bit CPU (>2 threads recommended), any 64-bit Windows version (Windows 7 should work, but developed for Windows 8.1 and up.) Remember to install the latest nVidia graphics drivers
Supports Mining: 0xBitcoin, KiwiToken, S.E.D.O., (standalone or merge-mined with 0xBTC), CatEther(0xCATE), Bitcoin Classic Token (on the Ethereum Classic network), LiraPay, CryptoPepes (mining 'rat-race' now over), and many other ERC-918 token varieties!
Important Note: Now built against CUDA v10.1+Update1, so it is strongly recommended that you update your nVidia graphics drivers. (It is NOT necessary to install the CUDA toolkit)
FEATURES:
CHANGES THIS VERSION:
Download Link:
https://bitbucket.org/LieutenantTofu/cosmic-v3/downloads/COSMiC-v4.1.3t-Win64.zip
Screenshots:
https://imgur.com/a/4LshQ1d
If you have any questions, comments or feedback feel free to PM me, comment here, on the 0xBTC subreddit, or the 0xBitcoin Discord ( Username: LtTofu#6168, Invite URL: https://discord.gg/NrYTf8 ).
submitted by LieutenantTofu to gpumining [link] [comments]

Understanding Crypto Mining | And perhaps a way to mitigate its impact on the PC gaming ecosystem

This is a crosspost from /hardware, but I will be editing this independently based on community feedback and guidelines. Prior to posting here, I reached out to your local mod staff to ensure that I wasn't stepping on any toes, given the nature of its content. I hope you find this useful.
Hi folks. I just want to thank those of you in advance who trudge through this post. It's going to be long. I will try to have a TLDR at the end, so just scroll down for the bolded text if you want Cliff's Notes.
Disclaimer: I'm a miner, sort of. I casually mine when I sleep/work, using my existing PC. It doesn't make much. I don't buy hardware for mining. But, I still wanted to post this disclaimer in the interest of fairness.
As we all know, cryptocurrency mining has had a devastating impact on the PC gaming ecosystem. The demand for GPUs for mining has led to scarce availability and sky-high prices for relevant hardware. But even hardware that is less desirable for mining relative to its peers (GTX 1050 Ti, 1080) has been impacted. Why? Because when gamers can't get the 1060 or 1070 that they desire, they gravitate en masse towards something that their finances will allow them to settle for.
But for all that we know about mining, there's still a LOT of myth and misinformation out there. And I blame this on the bigger miners themselves. They have a few tactics they're using to discourage competition. Now, why would they do this? Simply put, the more coins are mined, the harder the algorithms get. That means the same hardware mines a lower rate of cryptocurrency over time. If the mining rates were to get too low before new hardware (Volta/Navi) could be released, it would cause a massive depression in the cryptocurrency market. Most hardware would become unprofitable, and used GPUs would flood the market. Miners want to retain profitability on current hardware until the next generation hardware is out.
So, what tactics are they engaging in? Silence and manipulation. On the former, the bigger miners don't usually participate and contribute to the community (there are exceptions, and they are greatly appreciated). They're sponges, taking whatever the community provides without returning much to the community. On the latter, they post here, in this very sub occasionally. And they continue to push certain types of myth/misinformation to discourage other users from mining.
And why, of all people, would you discourage gamers from mining? It's because of the competition point mentioned above. If a massive number of gamers entered the cryptocurrency mining market, it could trigger a mining apocalypse. There's an estimated 3-4 million current-gen GPUs being used in 24/7 mining operations by dedicated miners. Now, how many current-gen GPUs are used by gamers? I'd bet at least an equal amount. But what about Maxwell and Kepler? Or all those GCN-based GPUs up through Fiji? Bottom line is that when you factor in all available profitable GPUs, gamers drastically outnumber dedicated miners (yes, Kepler and GCN 1.0 are still profitable, barely). And if a large number of those users started casually mining as I am, the following would occur:
  • difficulty would increase, lowering output (profitability) for everyone involved
  • Coin creation would initially accelerate, and with no massive change to the market cap, that means per-coin value drops
  • when you factor in slower coin generation for individual miners, coupled with lower coin value, you get...
  • ROI length increase on GPUs, depressing their values, which would lead to lower prices and higher availability
Oh dear, someone just spilled the beans...
So naturally, misinformation needs to be spread. If dedicated miners can keep the uninformed, well, uninformed, they're less likely to join in. And I've seen variations of the following misinformation spread. Here's the common tropes, and my rebuttal.
Mining on your GPU will cause it to die prematurely.
I really wish we had a Backblaze-equivalent for GPUs used in data centers. NOTHING punishes a GPU like full-time use in a data center. Not mining, not gaming, and not prosumer usage. And these companies pay thousands per GPU. Clearly, they're getting solid ROI for their use.
But let's talk about mining specifically. For my GTX 1060, I limit power to 80% (96W). Fan speed is at a constant 40% (that's in the same ballpark as your blower-style GPU in desktop usage). Temperature is a constant 75°C. That's gentle. Gaming hurts it more (start/stop on the fan, varying temps, quick rise at the start and fall at the end, varying loads, etc.).
And if GPUs did prematurely die from mining? One miner insisted that I'd never see an ROI on my 1060 (which cost me $240) because it would die before I could earn that amount. Yea, GPUs routinely die before hitting their ROI. That's why miners are buying $200 GPUs today for $500, or $400 GPUs today for $900. Because they don't generate enough to cover their MSRP, let alone their current gouged prices. /s
Common sense would dictate that miners are profitable, or they wouldn't mine. Therefore, GPUs are not dying prematurely. So, don't fall for this one. And yes, I've seen those photos of the 20-card Sapphire RMA. Mining data centers have THOUSANDS of cards. Just do an image search for a GPU mining farm. This is well within typical acceptable defect rates.
Power costs are too high for mining to be profitable.
Warning! Danger Will Robinson! Math ahead!
Where I live, electricity ranges from 9.5 cents per kilowatt-hour (kWh) to 10.1 cents per kWh. Let's round to 10 cents. Power measured at the wall from my surge protector, while mining, shows just under 200W. (That includes my tower, monitor, speakers, a dedicated NAS, a router, and PSU inefficiency.) That also includes mining on both CPU and GPU.
At a 200W draw, that's 5 hours to use 1 kWh, or roughly 4.8 kWh per 24-hour day, so let's call it 5 kWh per day. That is $0.50 per day total from that outlet (and most of this stuff would be running anyway). That's not even "over my existing costs," that's just out the door.
Bottom line is that electricity is cheap in many areas. The USA national average is currently ~12 cents per kWh (RIP Hawaii, at 33 cents). For most of the developed world, power costs are not prohibitive. Don't fall for this. If unsure, check the rates on your bill, and ask someone who can do math if you can't.
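The arithmetic above is easy to reproduce for your own rig. A minimal sketch, using the ~200W wall draw and 10-cents/kWh rate from this post (substitute your own measurement and billed rate):

```python
# Daily electricity cost of a rig: watts -> kWh/day -> dollars/day.
def daily_power_cost(watts, cents_per_kwh):
    kwh_per_day = watts * 24 / 1000           # 200W -> 4.8 kWh per day
    return kwh_per_day * cents_per_kwh / 100  # dollars per day

print(round(daily_power_cost(200, 10), 2))  # ~0.48, i.e. roughly $0.50/day
print(round(daily_power_cost(200, 33), 2))  # at Hawaii's ~33 cents/kWh
```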
Casually mining isn't profitable
There's a big difference between "profit" and "getting rich." I have no expectations of the latter happening from what I'm doing. But "profit" is very much real. It's not power costs that derail profitability. It's all of the hidden fees. Many mining programs take a cut of your output. And then a cut to transfer to a wallet. And then there's a fee to transfer to an exchange. Oh, did you want to then convert to cash? We can...for a fee!
The trick is in finding outlets that allow you to minimize fees. I give up 2% of my output, transfer to my wallet for free, can transfer to an exchange for free, and don't plan to cash out every time I meet the minimum threshold (higher fees!). I instead plan to cash out at extended set intervals to minimize those fees.
NOTE: I am deliberately not listing the provider(s) that I use, because I don't want to be accused of being associated with them and/or driving business to them. I want this post to be about the big picture. But I will answer questions in the comments, provided the moderation staff here has no objections.
Bottom line is that with a mid-range GPU like mine, and without the benefit of CPU mining (it's just not worth it without a modern Core i7, or Ryzen 5/7), my GPU alone could make me ~$60-$75/mo in profit at current rates. Think of how many months/years you go between upgrades. Now, do the math. Needless to say, I'm now regretting not going bigger up front :)
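To see how the fees stack up against the power cost, here is a rough sketch. The 2% pool cut, ~$0.50/day electricity, and $60-75/mo gross are the figures from this post; the flat $1 cash-out fee is a made-up placeholder, since those fees vary by provider.

```python
# Net monthly profit from casual mining: gross payout minus the pool's cut,
# minus electricity, minus a flat cash-out fee (amortize it by cashing out rarely).
def monthly_net(gross, pool_fee_pct, power_cost_per_day, cashout_fee):
    after_pool = gross * (1 - pool_fee_pct / 100)
    return after_pool - power_cost_per_day * 30 - cashout_fee

# ~$70 gross, 2% pool cut, $0.50/day power, hypothetical $1 cash-out fee.
print(round(monthly_net(70, 2, 0.50, 1.0), 2))  # ~52.6 dollars/month net
```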
It's too complicated for a casual miner, so don't bother
This is the old "go big or go home" saying, and it piggybacks off the last one. And there is some truth in it: if you're going to be a big-time miner, you need mining programs (often dedicated to each algorithm and/or currency), multiple wallets, access to multiple exchanges, etc. It's daunting.
But for the casual, you don't need that. There are multiple providers who offer you a one-stop-shop. I have one login right now. That login gives me my mining software, which switches between multiple algorithms/coins, gives me a wallet, and lets me transfer to an outside wallet/exchange. My second login will be the exchange (something that lets me convert my currency to local cash) when my balance justifies it. Given the recent Robin Hood announcement, I'm biding my time to see what happens. This space is getting competitive (lower fees).
Bottom line, it's easier now than it ever was before. As I told someone else, "Once I finally started, I wanted to kick my own ass for waiting so long."
New GPUs are expensive, but if you just wait, there will be a buttload of cheap, used GPUs for you!
Miners learned from the last crash. There were two types of miners in that crash: those who sold their GPUs at a loss, and those who kept mining and made out like bandits on the upswing. Turns out, cryptocurrency really does mimic the stock market (for now).
We're going to look at Bitcoin (BTC) to explain this. No, miners don't mine BTC. But, BTC is commonly what most coins are exchanged for (it makes up roughly one third of the entire cryptocurrency market). And it's the easiest currency to convert to cash. So, when BTC rises or falls in price, the rest of the market goes with it. That includes all of the coins that GPU miners are actually mining.
In January 2017, when the current mining push started, BTC was worth roughly $900 per coin. It's now worth roughly (as of this post) $12,000 per coin, down from a December high of over $20,000 per coin. So yea, the market "crashed." It's also more than 12x the value it was a year ago, when miners dove in. You think they're going to bail at 12x the value? Son, I've got news for you. This market needs to truly crash and burn for them to bail (and that's where you come in!).
So, there's not going to be a flood of used GPUs from a sudden market crash. Again, they've learned from that mistake. Used GPUs will enter the market when they are no longer profitable for mining, and not before. Dedicated miners have lots of room for expansion. When Volta comes out, they're not selling their Pascal GPUs. They're building new Volta mining rigs alongside the Pascal ones, making money off each of them.
Conclusion/TLDR:
  • Mining is subject to diminishing returns. It gets harder over time on the same hardware.
  • PC gamers joining the market en masse could trigger an apocalypse in terms of difficulty
  • Due to this, it benefits pro miners to spread misinformation to discourage gamers from entering the mining game
  • Casually mining on your existing system is safe, easy, could help you pay for your next upgrade(s), and could also hurt the mining market in general (better availability/pricing on GPUs)
  • No, there's no flood of used Pascal/Polaris/Vega GPUs around the corner, as those are HIGHLY profitable even in a depressed market
Second Conclusion - Why do I (jaykresge) personally care?
Simply put, I'm disgusted by this. I was excited about flipping a few friends from consoles to PC gaming. I'm now seeing a reverse trend. One friend is gaming on an RX 560 waiting for prices to hit sanity. He's running out of patience. Others have bailed.
I view our dormant GPUs as the best weapon against cryptocurrency mining. Destroy it from the inside. It's win-win for most of us. Either we earn enough for more upgrades, or we depress pricing. Something's got to give.
In other words, y'all f*ckers better start mining, because I want Volta to be reasonably priced when it launches so I can get an EVGA x80 Hybrid to go with a G-Sync monitor. And if this doesn't happen, I'm going to be cranky!
Seriously though, thanks for reading.
submitted by jaykresge to pcgaming [link] [comments]

Inexplicable CPU Throttling?

Hey all,

I run a 8600k with GTX 970, MSI Z370 Gaming Plus, 16 GB RAM 3000MHZ, EVGA Supernova G2 750W, m.2 ssd samsung evo 970 pro, and 2 normal WD Blue Sata drives.

So, I have been having an issue that I just cannot explain, and I have tried almost everything that I have found (and I found a lot in 3 days) on reputable sites online, but still to no avail. My CPU cores have been running for the last year or so at 4.6GHz with XMP enabled, without having messed with any other settings in the BIOS (voltage was on auto), and running fine, except for one thing: CPU-demanding games (BFV, Apex Legends) would underperform, but just enough that I would blame it on my "shitty" GPU (I run 1440p, and a 970 should be expected to underperform in newer releases) and not look into it further, because I am an idiot. This became apparent to me lately when I tried to play MORDHAU (the 64vs64 mode is pretty demanding on the CPU) with some friends, and the CPU throttling is so obvious there that it forced me to look into it, but also to consider that both previously mentioned games had more or less the same issue.

I will explain exactly what happens that makes me think it is CPU throttling, in order to give you a better understanding:
The game runs (when stable) at 80-100 fps on medium settings, give or take, then suddenly dips to 5 fps for anywhere between 20 seconds and 2 minutes, only to jump up again. It happens completely randomly, to the point where the first time I played it only happened twice in 2 hours, and the next time I tried to play it happened non-stop and made the game unplayable. I dropped all the settings to the lowest, and even went as low as 1200x800 resolution, but while fps increases greatly, the dips to 5 fps still happen frequently. Drivers were updated, and I also downloaded a recent hotfix from NVIDIA that wasn't on GeForce Experience and supposedly fixes some CPU usage issues. I transferred the game to my SSD just to rule out a read issue: nothing. I applied all the usually suggested settings (High CPU priority for the game in Task Manager, tweaks in the GeForce Control Panel), but still the same.

While this happens, my CPU temp is at 60-70C and does not increase or drop, GPU temp is 65-80C and does not increase or drop, and all 6 core clocks are stuck at 4600MHz, but CPU usage dips from 55%-70% to 25%-35% along with the fps. Speccy showed the mobo at 41C, and CPU voltage reaches 1.216 to 1.232 under load and as low as 1.206 when idle, so that looks pretty normal.

This appeared to be CPU throttling, so I thought VRM overheating might be the issue, and when I removed the tempered glass and touched the VRMs, they were burning hot. I went into the BIOS and changed back to stock clock speeds (with Game Boost), which takes the clocks to 4.3-4.4GHz, only for it to happen again. I switched Game Boost off and tried Turbo Boost only (4.1GHz) and again, nothing. In this case, I even tried running with the case glass open, switched all my airflow to have 2 case fans blow directly towards the VRMs, and put an air conditioning unit on top of the case blowing cold air towards it. The VRMs were not hot this time, but the issue remained the same.

The only thing that improved is that the throttling is slightly shorter in duration, but it happens with (more or less) the same random frequency, making the game unplayable (the PC is also unusable during these dips due to stuttering).

I ran HeavyLoad to see if the CPU had an issue but the test turned out fine. I read about how sometimes mobo power limits your CPU so I increased it from 95W and 120W short to 150W and 200W, but again nothing. I also read about how this could be caused by malware (heard about a bitcoin mining virus) so I ran multiple scans with different programs. I am now considering restoring factory BIOS and formatting everything at the same time to see if it gets fixed but I will not try it until later today.

Apologies for the wall of text, but in the last 3 days I have tried everything I could find or think of to fix this issue at hand and still. Does anyone honestly have any idea of what could cause this? Could this be an issue with a faulty hardware part? Or something else I am missing?

Thank you in advance.
submitted by Slike7 to overclocking [link] [comments]

What the fuck is steam VR doing to my computer?

SOLVED: Zmann966 has solved it. https://www.reddit.com/oculus/comments/6wspn2/what_the_fuck_is_steam_vr_doing_to_my_computedmaqq6o/
Steam's vrserver is creating massive error log files.
More info: https://www.reddit.com/Steam/comments/4sr65e/612gb_text_log_in_steam/d5br6aj/
https://www.reddit.com/oculus/comments/4hcp9j/explanation_of_where_300gb_went/
https://forums.oculus.com/community/discussion/49829/vrserver-txt
https://steamcommunity.com/app/250820/discussions/3/357286663692515618/
https://github.com/ValveSoftware/openvissues/444
Man, fuck steam.
I've been noticing some extremely odd behavior from steam's VR applications to the point that it has me very worried and I can't find any explanation for it in the internet.
So I'm at work the other day and I get a call from my girlfriend telling me Photoshop isn't working because the scratch disks are full. I thought this was weird because I was sure I had over 100GB of space left on my C drive, but hey, I had been installing a lot of VR games, so it's possible. So I check it out when I get home and find, to my horror, my C drive: 0 bytes available. Every last byte gone. So I quickly start making some space by deleting old games.
This is where I'm truly shocked.
Every gig I free up disappears in front of my eyes in seconds.
I try closing everything and rebooting several times; no dice. No matter what I do, space just keeps being eaten gig by gig. I end up downloading CCleaner which, to my disbelief, tells me "You could free 150,000,000kb by removing Steam VR temporary files."
What the fuck.
The program wasn't even running!
After removing it and reinstalling the problem seemed to be solved.
Except last night after playing some robo recall and closing Oculus home (again didn't even start steam) I find after a couple hours my computer is burning hot.
My computer is never hot, so I fire up MSI afterburner to see what's going on only to find my GTX 1080 ti is running full tilt at 80°c with no applications open.
Into the task manager I go, lo and behold steamvrserver is using 50% of my resources even though steam isn't running. I end task it and my GPU temp immediately drops to 40°c.
What the hell is going on?
Edit: Seeing as everyone is so eager to inform me I have a bitcoin mining virus I just want to point out I've scanned my computer several times with Malwarebytes, AVG and windows defender and it's all clean.
Also the space eating and GPU stressing were on separate occasions.
Most likely answer I can see is SteamVR failing to quit and writing enormous log files for days. Hard to verify as I cleared steams temp files when I found out they were 150Gb.
Would also explain the GPU if it was background running several instances of steam VR home that failed to quit.
Thanks for the help, those of you who were helpful.
submitted by Cerpin-Taxt to oculus [link] [comments]

COSMiC Miner v4.1.2 Update (Win64/nVidia | GUI-based | Multi-GPU)

Hello everyone! I'm happy to share what I've been working on with you today. This is an update to COSMiC Miner which adds new features, under-the-hood improvements and (of course) optimizations to the CUDA core for greater efficiency. I call this build "beta" because I've added significant new functionality, but it actually includes refinements for greater stability. With miners' feedback, I'll make any desired improvements for 4.1.3. :)
Important Note: Now built against CUDA v10.0 - It is strongly recommended that you update your nVidia graphics drivers. It is NOT necessary to install the CUDA toolkit.
FEATURES:
CHANGES THIS VERSION:
Screenshots: https://imgur.com/a/L5l5bu5
Download Link: https://bitbucket.org/LieutenantTofu/cosmic-v3/downloads/COSMiC-Miner-v4.1.2b.zip Screenshots:
If you have any questions, comments or feedback feel free to leave them here or contact me on the 0xBitcoin Discord. (Username is LtTofu#6168.)
submitted by LieutenantTofu to 0xbitcoin [link] [comments]


PC crash while gaming, can't reboot unless PSU turned off and on

Title. 2700X and GTX 1070, 550W Bronze-rated PSU. Reseated the CPU with an H55 cooler; it idles at ~40C. The PC will shut down in game, and the power button will not start it unless I turn the PSU off and back on. Bitcoin mining works fine, with no problems at 100% GPU load. Temps look fine when gaming, not exceeding 75C, until the crash.
submitted by Jusrat to buildapc [link] [comments]

Tips for improving 1080 Ti performance?

Hi there! So I've had my 1080 Ti for almost a year now, and it seems like it isn't performing as well as it should.
In a game like Overwatch, I'm getting 130ish FPS even on low settings, no V-sync or anything, at 1440p. I see that benchmarks that people have posted are about 200FPS with max graphics settings at 1440p.
In FFXV, I've struggled to maintain above 60FPS with max settings in some areas, when I should be getting 80-120...
My specs are as follows:
Intel i7-8700K currently at stock, cooled by a Corsair Hydro H100i v2
GTX 1080 Ti, liquid cooled and at stock settings
16 GB DDR4 2400 RAM
3 SSDs; 1 NVMe boot drive, a 1 TB SATA M.2 SSD, and a 250 GB SATA SSD
1000W power supply
144hz gaming monitor
I've noticed that my GPU Temps have gone up lately; it used to idle at around 25°C, but now idles at 37-40°C.
I currently have my Windows and Nvidia power settings set to 'prefer maximum performance.'
At one point in time, I tried overclocking my GPU - and at one point I unlocked the voltage control and may have gone a bit too high. But I hadn't set it like that for very long, and I simply stuck with a modest overclock. But I noticed a few days ago that if I adjusted the overclock, my screen would show static for less than a second before returning to normal. I'm guessing this isn't normal? I'm not currently overclocked.
As another note, I noticed the increase in idle temps long before the overclocking.
And finally, at one point I was briefly infected with a bitcoin mining program - I say briefly because it was stopped within seconds by my anti-virus. I've wiped my computer quite a few times so I doubt it's still there, but I felt I should mention it.
submitted by BaakiBree to computers [link] [comments]

Bitcoin Private mining pool https://www.bitpoolmining.com

Bitcoin Private mining pool https://www.bitpoolmining.com
 
Hello fellow BTCP nerds. We are a group of 4 software developers that are all miners. We decided to use our development background to start a pool and slowly add features that we wanted to see in other pools. Any feedback or suggestions would be greatly appreciated. We have also built an open source GPU mining/monitoring software app that we call BPM that is also BTCP compatible https://bitpoolmining.com/bpm. Our goal has been to build cool software and to help the community whenever we can. We set out to build software for 'one click mining' that can help ease setup for BTCP mining for new/small miners and also provide tools powerful enough for mining farms.
 
Some neat pics:
 
Pool details:
 
 
BPM GPU monitor app details:
 
 
BTCP Connection Info:
 
Stratum Server: us-east.btcp.bitpoolmining.com
Ports:
3060 – GPU miners (Vardiff enabled)
3061 – ASIC miners (Vardiff enabled)
3062 – Nicehash (Vardiff enabled with minimum difficulty)
Username: YourBTCPAddress.YourWorkerName
Example: b1Rwkc2PYb66DJ7bznqFjJdKcxPMbG6FbS8.RIG01
Password: Use "x" (**the password is ignored by BitPoolMining servers**)
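Putting the connection details together, a launch line might look like this. This is a sketch: the "miner" binary name and the --server/--port/--user/--pass flags follow the convention of EWBF-style Equihash miners and are not specified by the pool; substitute your own BTCP address and worker name.

```shell
# Hypothetical launch line for an EWBF-style Equihash miner (flags assumed);
# the address.worker username follows the pool's convention above.
miner --server us-east.btcp.bitpoolmining.com --port 3060 \
      --user b1Rwkc2PYb66DJ7bznqFjJdKcxPMbG6FbS8.RIG01 --pass x
```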
 
Chat with us on Discord https://discord.gg/gxvyJuA
edit: full disclosure, I am one of the pool owners, developer and operator.
 
double ninja edit: If this is the wrong place to post, please let me know, I don't want to be spammy.
submitted by salty_miner to BTC_Private [link] [comments]

My Experience: From FX-8350 to R7-1700

Upgrading from an FX-8350 to a R7-1700.
Just a bit about me – I have been building computers since the mid 80’s. I missed the 8-inch floppy disk era, but came on board when dual 5.25” was considered mainstream and a 10-megabyte full-height HDD was the mark of a power user. The first computer I built for my own enjoyment was an AMD X5-133 (a factory overclocked 486 faster than the Pentium-75), and I’ve used a wide variety of systems since then, including a Pentium Pro-200 which served me well in college and a K6-2 which I took to quite a few LAN parties. While I’ve always had Intel notebooks, my PC’s have been AMD for quite some time now. I decided to upgrade my current main machine, which is an FX-8350 with a mild 4.4Ghz overclock. I was using 2x8GB Crucial Ballistix DDR3-1600 and a Sapphire Radeon Fury Nitro. While I know the R5-1600x would be a better bet for a pure gaming build, I have a soft spot for 8-core machines. I had been tempted to pull the trigger on an i7-7700k for a while, but the timing never worked out. But when I found the R7-1700 at a deep discount and an X370 motherboard on the shelf next to it – I couldn’t resist the siren call of a new build.
Here are my thoughts about the process:
AM4 is physically the same as AM3 from a build perspective, except for the mounting holes. I don’t know what was so important about giving the holes different offsets, but this makes it much more difficult to get quality cooling. Not all manufacturers have brackets yet, and I’m still waiting on Cooler Master to release the brackets for my Seidon 240.
The new motherboard feels very different from my AM3 board. My FX-8350 sat on an ASUS M5A99FX Pro R2.0. It was, for lack of a better word, a very workstation-ish board: 4 PCIe x16 slots, 10 USB ports (2 of them USB 3.0), triple USB 2.0 front panel headers (and a USB 3.0 front panel header as well), eSATA on the rear panel, beefy VRM and northbridge cooling, Toslink output for audio, and so on. The board itself is full of tiny components, support chips, and ports. Granted, many of these connectors are outdated (eSATA and USB 2.0), and the PCIe is only 2.0 instead of current-gen 3.0, but there is a LOT of connectivity. Few people paired an FX chip with triple- or quad-GPU setups for gaming, but I know a fair number of people used these boards for bitcoin mining back before there was widespread ASIC support, when GPU mining was the most cost-effective way to mint cryptocurrency. Extra PCIe slots could be used for dedicated video capture, PCIe-based storage, a RAID card, etc. Having 4 full-size slots allows this kind of flexibility.
The new motherboard is an ASRock Fatal1ty X370 Gaming K4. It does not feel very workstation-ish at all. It has only two x16 PCIe slots (and when both are in use they drop to x8), 8 USB ports on the rear panel, and a much less “busy” layout. Very few support chips litter its surface. Instead of a workstation component, it feels much more like a luxury consumer product. This is not a bad thing – just something I noticed while building the system. The rear I/O shield is red and black to match its gaming aesthetic, it includes things like premium audio (with a very nice headphone amplifier for the front panel connectors), and of its 8 rear USB ports, 6 are USB 3.0 and two (including a Type-C connector) are USB 3.1 Gen2.
It includes RGB LEDs under the chipset heatsink and three separate RGB LED controller ports (one of which is used for the boxed cooler), Intel gigabit Ethernet, and dual M.2 slots (one of which connects directly to the CPU). It is very different in “feel” from the older ASUS board, down to things like a shroud for the external connectors and metal-reinforced PCIe slots. I must say, its more aggressive appearance and near-empty areas appeal to me. It does, however, funnel the builder into a particular configuration: limited fast storage through the M.2 slots, slower storage through the 6 SATA ports, and all external devices over USB 3. Personally, these limitations didn’t restrict me for this build, since that was how I was going to set it up anyway, but the fewer connectivity choices might give others pause. The only thing I don’t like about this board is the 20-second POST times. 20 seconds, every time. Resuming from sleep is very fast; just reboots are slow. That’s really it. I have no substantive complaints other than that – well, and the memory speed limitations – more on that below.
The Wraith Spire cooler is without doubt the best looking box cooler I’ve ever seen. The symmetrical cylinder look, combined with the LED logo and RGB ring are very striking. I can see why many people have asked to order one, though I think for the 1700X and 1800X they are better off without it. I’ll explain why further down.
Initial hardware setup was very easy. I was able to flash to the newest 2.0 BIOS without any hassle using a DOS USB flash boot drive. The 2.0 BIOS has the newest AGESA code from AMD, as well as support for the R5 processors and better DDR4 compatibility. I didn’t want to cheap out on RAM, since Ryzen is apparently sensitive to DDR4 speeds for the latency between cores. I bought the cheapest 16GB DDR4-3200 kit I could find (the EVGA SuperSC 2x8GB), for which I paid $115. While I was not able to get it to boot at 3200, I could get 2933 simply by activating XMP and then manually changing the speed from 3200 to 3000. I then tested it with MemTest86 for two complete cycles, which it passed without errors. I have encountered zero memory issues with these RAM sticks running at 2933. Since this motherboard does not officially support DDR4-3200 at all, I figure this is a good outcome. I am curious to know whether anyone has gotten 3200 on this board – that is, whether the lack of 3200 memory on ASRock’s QVL is a marketing issue or an actual hardware limitation – but I didn’t want to spend nearly double that amount to get AM4-verified memory (G.Skill’s Flare X), and 2933 seemed fast enough from the benchmark results I had read.
My old setup had a Samsung 850 EVO 256GB SATA6 drive as the primary boot/gaming drive. It seemed plenty fast but had become too small for my needs, so this seemed like a good opportunity to buy a new SSD. I originally thought the NVMe drives would be out of my price range, but I bought the Intel 600p 512GB drive for only $10 more than I would have paid for a premium SATA6 drive. Though the 600p is without doubt the SLOWEST NVMe drive out there, it has 3x the read speed of the SATA6 drives, and most of what I am doing with it is trying to get quicker load times. If I were using it for professional workloads (as a video editing scratch drive, for example), I would need much higher sustained write speeds, and then Samsung would be the obvious answer. I just didn’t want to spend an extra $80 on write performance that I’d never notice, and the 600p has been an excellent boot/gaming drive.
Ok, back to the Wraith Spire. I tend to have bad luck with the silicon lottery. My FX-8350 was not stable above 4.4GHz with reasonable temperatures. I was hoping for better results from the R7-1700, since general reports indicated that it overclocked well. Unfortunately, it is difficult to tell how good an overclock I am getting, since I can find no good information about maximum recommended temperatures for this chip. Some people say 75C is the maximum safe temp. Others say 75C is a fine everyday 24/7 temp. Others say they are running it at 80C all the time without any issues at all. Steve at Techspot was getting 88C and 90C when overclocking the 1600X and 1500X using the stock coolers and without any instability – were those dangerous temps or totally fine? Nobody seems to know. I like my overclocks to be set-and-forget. I want to get it dialed in and then leave it for years without worrying that it will burn up or degrade, or that in this or that application I have to turn back to stock speeds because of the thermals. Since I don’t know what max safe thermals are, I just have to guess based on stock thermals.
For stock speeds, the Wraith Spire does a good job. It is very quiet, and after a few BIOS fan-curve tweaks, it keeps the chip around 35-38C at idle and around 68-70C in Prime95 (Small FFT, for maximum heat generation). Incidentally, it also hits 70C if I run Cinebench a bunch of times in a row, so I don’t consider the Small FFT test totally unrealistic for the load this chip might encounter. From what I can tell, these are good normal temps. I can get 3.5GHz by simply changing the multiplier and leaving the voltage at stock. This gives Cinebench numbers around the 1550 mark (roughly 6900K levels). Prime95 shows a modest temperature increase of 3-4 degrees C, and it was stable even for several hours. If I push it to 3.6GHz at stock voltage the system is unstable. At 3.7GHz (the 1700’s boost speed for single-threaded loads) it is stable only if I give it 1.3V. While that is a totally fine voltage (AMD recommends up to 1.35V for 24/7 use), the Wraith Spire cannot handle a Prime95 Small FFT load anymore. I shut down the test and reverted the OC when the CPU read 89C. Given that the Spire was meant to cool a 65W chip (and so is probably rated for no more than 85-95W), this is not a terribly surprising temperature – I just wish I knew whether it was dangerous. I have no doubt that a 240mm radiator or even a decent tower cooler will be more than enough to cool my 3.7GHz R7-1700. I am a little jealous of the people who just set the multiplier to 3700 and are good to go – lower voltages probably mean the Spire would be enough. But for me, it was not to be. I was halfway tempted to see at what temperature the chip would reduce its clock speed, but I didn’t want to burn up a chip I had just bought – might as well wait until I get bigger and better cooling to OC it to the 3.8-3.9GHz I hope it will reach.
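A rough rule of thumb helps explain why a 65W-class cooler gets overwhelmed at 3.7GHz and 1.3V: dynamic CPU power scales roughly with frequency times voltage squared. The sketch below uses illustrative numbers only – the 65W TDP is real, but the ~1.05V stock voltage at a 3.0GHz base clock is my assumption, not a measured value:

```python
# Dynamic power rule of thumb for CMOS logic: P ~ f * V^2.
def scaled_power(base_w: float, base_f: float, base_v: float,
                 new_f: float, new_v: float) -> float:
    """Estimate power at a new frequency/voltage from a known baseline."""
    return base_w * (new_f / base_f) * (new_v / base_v) ** 2

# Assumed baseline: 65 W at a 3.0 GHz base clock and ~1.05 V (illustrative).
est = scaled_power(65, 3.0, 1.05, 3.7, 1.30)
print(round(est, 1))  # ~122.9 W -- far beyond a 65 W-class cooler's target
```

By this back-of-the-envelope estimate, the overclocked chip is asking the Spire to dissipate nearly double its design load, which matches the 89C observation above.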
Other than the OC temps it has been smooth sailing. Gaming feels more fluid than with the FX, even in games that I always thought were GPU-limited and/or running at 60fps with VSYNC on. Especially in games that are sensitive to single-core performance (Heroes of the Storm is my latest addiction), there is a definite boost in 1% low and 0.1% low FPS. I have been using the Ryzen Balanced power plan from AMD and it seems to do a fantastic job of keeping temps low at idle and letting the cores ramp up really fast when needed. I need to test whether the lack of core parking prevents it from hitting the 3.7GHz boost as often as the regular Balanced plan allows. I think a simple Cinebench single-thread comparison will do the trick.
I also tried streaming a bit – it handled 1080p60 at x264-medium settings without a noticeable impact in game. Later I edited some video of my kids – the final render speed was SOOOO fast. I am, on the whole, very happy with my upgrade. I get better single-core performance, much, much better multi-core performance, faster disk speeds, and a more modern platform (with RGB lighting, M.2, USB 3.1, etc…).
Now if only I could find out appropriate temperatures…..
submitted by Morphon to Amd [link] [comments]

New to the mining game and Nicehash, advice please

Hey guys, I'm very new to the game of coin mining and selling hashing power (about a month in so far). I ran Folding@home for years on my previous GPUs and CPUs, and I think that's a great cause and use of otherwise idle hardware. With that said, I also like money :) I'm looking for some help in optimizing my system, managing transactions, and understanding tax implications.
Background/Specs:
- GTX 1080 Ti (EVGA SC2 Hybrid, OC'd a bit, max temp 52C)
- Intel 6700K @ 4.5GHz
- NiceHash Miner v1.7.5.13 (I tried v2 in alpha a few weeks ago and had issues)
- $0.056 per kWh
- Max 500W draw at the wall using a Kill A Watt (includes monitor, router, printer, etc.)
- Currently using Coinbase as my wallet
When I started a few weeks ago, I was making nearly $9 per day... I know this is due to the Bitcoin price swings and the general uncertainty around crypto recently. I use my PC for gaming and other stuff around 6 hours per week and mine the rest of the time. So far I've made about 0.1 BTC (~$260), which I certainly can't complain about.
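Those per-day figures are easy to sanity-check against the electricity numbers above. A quick sketch – the revenue value is just the low end of the $4-$6/day range quoted below; everything else is from the post:

```python
kwh_price = 0.056        # $/kWh, from the post
wall_draw_kw = 0.5       # 500 W measured at the wall

daily_cost = wall_draw_kw * 24 * kwh_price      # kWh per day * price
daily_revenue = 4.0                             # low end of the quoted range
daily_profit = daily_revenue - daily_cost

print(f"electricity: ${daily_cost:.3f}/day, net: ${daily_profit:.2f}/day")
# electricity: $0.672/day, net: $3.33/day
```

At six cents per kWh, power is a small fraction of revenue, which is why the setup stays profitable even at the low end of the range.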
My CPU is running at about 200 H/s on CryptoNight. I tried tweaking a bit per https://github.com/nicehash/NiceHashMiner/wiki/Notes-and-hints-on-CPU-mining by enabling locked pages, but didn't see any bump in performance. I'm only seeing 40%-60% utilization on cores/threads. As I'm typing this I'm pulling down about $0.39 per day. Any other ideas for bumping performance here? My 1080 Ti is pulling around $4-$6 per day ($5.25 on LBRY as I type this). Are these numbers reasonable, and is there anything I should be doing settings-wise?
Finally, taxes. I have another job, so I assume I should be reporting these NiceHash BTC deposits as income. I see Coinbase has a transaction history which shows my BTC deposits and their $ value at the time. What can I write off? Only my GPU was purchased this year; the rest of the hardware was already in place. My PC hardware, power, internet, office space, etc. are being used for mining 162 out of 168 hours per week (96%).
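The 96% business-use figure works out like this – arithmetic only, using the post's own hour counts (whether each item is actually deductible is a separate question):

```python
hours_per_week = 168
personal_hours = 6                   # ~6 hours/week of gaming and other use
mining_hours = hours_per_week - personal_hours
business_use = mining_hours / hours_per_week

print(f"{mining_hours} of {hours_per_week} hours -> {business_use:.1%}")
# 162 of 168 hours -> 96.4%
```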
Thanks!
Edit: corrected weekly usage
submitted by uniwarking to NiceHash [link] [comments]
