Greetings, I am supporting a physical Windows 2012 server at a client. For several years we have had Carbonite cloud backup working without any problems. Jobs were configured to first back up data to a local USB drive, and then Carbonite would take the resulting local compressed backup data file and upload that to Carbonite's cloud storage.

The internet provider for the client went belly up a few months ago, and the office lost internet service. I had users connecting to the internet using wifi adapters on their PCs and a Verizon MiFi as a local hotspot. Obviously, for security purposes, the server was not connecting to wifi during this period. Internet service was finally restored through a new provider in August. Since that time Carbonite has never worked properly. Most cloud backup uploads failed to complete.

I have updated Carbonite to the latest version, but am getting a regular error in the Application Log when doing any cloud backups of c. 15 GB and larger, with the following information:

Faulting application name: ZWCController.exe, version: 6.0.0.0, time stamp: 0x5f0317b2
Faulting module name: MSVCR120.dll, version: 9.5, time stamp: 0x56bc00d3
Faulting application path: C:\Program Files\Carbonite\Carbonite Safe Server Backup(x64)\bin\ZWCController.exe
Faulting module path: C:\Windows\SYSTEM32\MSVCR120.dll

The MSVCR120.dll file is a Visual C++ library file. I have manually removed and reinstalled this library with the latest version / Rel 5 over the weekend, removed and reinstalled Carbonite, and the errors persist. I can run a smaller backup job - 10 GB - to both local and cloud backup, but as mentioned, when running anything a few GBs bigger than that, the upload fails and throws the above error in the application log.
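One thing that may be worth verifying is which build of the runtime the reinstall actually left behind. A minimal sketch (Python with the pywin32 package - both the use of pywin32 and checking the 32-bit copy alongside the one from the event log are my own assumptions) that reads the file version stamped on MSVCR120.dll:

import os
import win32api  # from the pywin32 package

# First path is the faulting module path from the event log entry above;
# the SysWOW64 copy is the 32-bit runtime, included for comparison.
PATHS = [r"C:\Windows\System32\MSVCR120.dll",
         r"C:\Windows\SysWOW64\MSVCR120.dll"]

for path in PATHS:
    if not os.path.exists(path):
        print(path, "-> not present")
        continue
    info = win32api.GetFileVersionInfo(path, "\\")
    ms, ls = info["FileVersionMS"], info["FileVersionLS"]
    print("%s -> %d.%d.%d.%d" % (path, ms >> 16, ms & 0xFFFF, ls >> 16, ls & 0xFFFF))

If the version printed for the System32 copy doesn't match what the Update 5 installer ships, that would suggest the reinstall never actually replaced the in-use DLL.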
Carbonite tech support is saying it's a Visual 2013 C++ issue - i.e. call Microsoft - even though some jobs are succeeding. I have another client running the same build of Carbonite and Visual C++ 2013 who is not having any issues. Any thoughts on how to approach this / suggestions on troubleshooting? Appreciate any experience and/or suggestions in advance.

We are in fact running only full backups. As part of my diagnostics I have done:
- a full removal and reinstall of Carbonite, adding a couple of jobs manually (I did not import the backup sets, which are normally saved in the cloud)
- a removal and reinstall of the Visual 2013 C++ library, updating it to release/update 5

I ran a full disk + cloud backup of a primary data folder store of c. The disk backup completed OK, but the cloud backup failed at c. I decided to see if backing up to USB disk first and then to cloud was an issue, so I began to test jobs being backed up to the cloud only. An 8 GB job to cloud only from one of our main document stores succeeded. I then ran a couple of small jobs (up to 10 GB) to both disk / local USB and cloud without issues. I ran one of the main jobs we normally run - a documents folder store called Programs with c. 10 GB of data - backed it up to disk and cloud, and that worked OK.

In my last call to Carbonite and my conversation with Escalations, the tech I spoke to said that the reason for the ZWCController.exe errors with the Visual C++ library may be that the C++ library is failing when the job size gets too big. I have a hard time believing that to be the issue, and think he's using that to avoid digging further into what is causing the cloud backup failures.

I have since been running a sequence of test backups of patch / install files, starting with a job size of 15 GB and increasing each successive test job in data size by c. Backups are being saved to both disk and cloud. I'm trying to see if it in fact fails at a certain job size. The last job I ran was 32 GB in size and that completed without issue. I should be getting up to 50 GB job size by tomorrow afternoon.
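In case anyone wants to reproduce the size-threshold test without collecting piles of patch / install files, a throwaway generator along these lines would do it (plain Python; the staging folder and the ladder of step sizes are placeholders, not my actual jobs). It writes pseudo-random bytes so compression can't shrink the job back under the size being tested:

import os

STEP_GB = [15, 20, 25, 32, 40, 50]   # example ladder of job sizes
TARGET_DIR = r"D:\BackupTests"       # hypothetical staging folder
CHUNK = 64 * 1024 * 1024             # write in 64 MB chunks

os.makedirs(TARGET_DIR, exist_ok=True)
for gb in STEP_GB:
    path = os.path.join(TARGET_DIR, "testdata_%02dGB.bin" % gb)
    remaining = gb * 1024 ** 3       # bytes still to write for this file
    with open(path, "wb") as f:
        while remaining > 0:
            n = min(CHUNK, remaining)
            f.write(os.urandom(n))   # incompressible filler
            remaining -= n
    print("wrote", path)

Pointing each successive test job at the next file makes the job size exact, rather than approximated from whatever install media happens to be on hand.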
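To save hand-scrolling through Event Viewer while the test jobs run, the matching crash entries can also be pulled from the command line. A rough sketch, assuming the crashes are logged the way typical application faults are (Event ID 1000 from the "Application Error" source) - adjust the filter if they land elsewhere:

import subprocess

# XPath filter for application-fault events in the Application log.
QUERY = "*[System[Provider[@Name='Application Error'] and (EventID=1000)]]"

out = subprocess.run(
    ["wevtutil", "qe", "Application", "/q:" + QUERY,
     "/f:text", "/rd:true", "/c:20"],   # newest first, last 20 events
    capture_output=True, text=True, check=True,
).stdout

# Keep only the entries whose faulting application is ZWCController.exe.
for event in out.split("Event["):
    if "ZWCController.exe" in event:
        print("Event[" + event.strip())
        print("-" * 60)

Running this after each test backup gives a quick yes/no on whether the job produced the MSVCR120.dll fault, plus the timestamp to line up against the job log.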