Sync.com seemed like a good alternative for sharing files AND backup AND version control. With any new system, someone tries to break it (usually inadvertently) and one needs to recover. Someone moved (copied, actually) a synced SHARED folder to the desktop. The same directory also sat, stuck in time, in its proper spot – so it looked like the sync broke. However, the copy on the desktop kept syncing, and that was the one being kept up to date.
There is a hidden .syncinfo file (not sure exactly what it is called) in each folder being synced. Even if you turn on hidden files, it cannot be seen. Continue reading
Well, it seems that after these last ransomware attacks (for which I have been part of the reconstruction crew) I have learned about other methods of connecting to remote systems. RDP is a pretty lightweight protocol for remote connection – it works and it works well, except for its vulnerability last quarter. So … if we limit HOW we get to the point where RDP is available (VPN with certificates, username, SSL, limiting firewall scope), then we can still use RDP.
But how do we secure RDP down even further? Ideally I would like it so that if the certificates match – boom, you're in, and only from certain machines. SSH works like this on Unix. Here are some links to peruse that might answer this question. I will augment this article once I have cracked what I wish to accomplish (this sentence will be removed). Oh – and please do not email certificates or passwords. HUGE pet peeve of mine when websites or people do this.
I previously blogged about exporting files and metadata from SharePoint. It worked!!!! Well, now that all the files are extracted, I have ended up with a CSV.
- it seems some doubled double quotes are included – but a proper CSV parser (e.g. Excel) will turn those into one quote character
- The Xml column contains an xml node
- Add an xml header and a root node
- <?xml version="1.0" encoding="UTF-8" standalone="no" ?>
- <z:row …..
- Then, using Notepad++ with the XML Tools plugin installed, you can surf the node path with Ctrl+Alt+Shift+P
- turns out to be /root/z:row
- a c# app – try this article
- Also – use the Plugins->xml tools->Pretty Print Attributes
- An XPath /root/z:row[@ows_MetaInfo] would get what we require
- Parse this crazy thing with CRLF or
- The first number 1234;# is the id number – remove it
- field name
- TY = type
- val = value
- Use a regex like this to parse
- Now .. undoing the whole thing by hand
- undo the HTML Entities – now this will likely be done with the XML API
- use the Notepad++ plugin HTML Tag (Plugins -> HTML Tag -> Decode Entities)
- &lt; -> < etc.
- unencode entities like &#x0020; to a space etc.
- To extract thumbnails – they are stored in Base64 – here is a c# app to decode into jpegs
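The steps above can be roughed out in code. This is a sketch, not the C# app mentioned in the post: it assumes (as the list implies) that each ows_MetaInfo entry starts with a `1234;#` id, and that entries look like `name:TY|value` separated by CRLF. The exact regex the post alludes to was not preserved, so this one is my own guess.

```python
import re

# Hypothetical entry pattern: "fieldname:TY|value"
# (TY = two-letter type code, val = the value) -- an assumption from the notes above.
ENTRY_RE = re.compile(r"^(?P<name>[^:]+):(?P<ty>[A-Z]{2})\|(?P<val>.*)$")

def parse_meta_info(raw: str) -> dict:
    """Strip the leading '1234;#' id and split the remainder into fields."""
    raw = re.sub(r"^\d+;#", "", raw)          # "The first number 1234;# is the id number – remove it"
    fields = {}
    for line in raw.split("\r\n"):            # "Parse this crazy thing with CRLF"
        m = ENTRY_RE.match(line)
        if m:
            fields[m.group("name")] = (m.group("ty"), m.group("val"))
    return fields
```

Field names like `vti_author` are typical of SharePoint metadata, but verify against your own export before trusting the pattern.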
When you create a website with basic auth there is an unfortunate consequence: there is no built-in method for logging out the user. Since the basic auth username and password are stored on the client side, the server has no power to remove what is stored in the user browser’s cache. The solution? Create a log out screen where the user is prompted to enter a wrong password. This is counter-intuitive compared to most log out systems, but it is unfortunately the only way to get the client’s own browser to replace what is in the basic auth cache. A good example of how to set up the code to make this happen can be found at the following website: http://php.net/manual/en/features.http-auth.php.
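The trick translates to any stack: a "logout" URL simply answers 401 again so the browser discards its cached credentials and re-prompts. A minimal WSGI sketch (credentials, realm, and paths are all made up for illustration; the php.net page linked above is the original source of the idea):

```python
import base64

USER, PASSWORD = "alice", "s3cret"  # hypothetical credentials for the demo

def app(environ, start_response):
    auth = environ.get("HTTP_AUTHORIZATION", "")
    expected = "Basic " + base64.b64encode(f"{USER}:{PASSWORD}".encode()).decode()
    # On /logout, always re-challenge with 401 so the browser forgets the
    # cached credentials -- the only "logout" basic auth offers.
    if environ.get("PATH_INFO") == "/logout" or auth != expected:
        start_response("401 Unauthorized",
                       [("WWW-Authenticate", 'Basic realm="site"')])
        return [b"Logged out" if environ.get("PATH_INFO") == "/logout"
                else b"Auth required"]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello"]
```

The user then cancels (or mistypes) at the prompt, which is the "enter a wrong password" step described above.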
In order to set this up for customers to use automatic file transfer via WebDAV, I found a really good website that seems to provide the perfect method – except that you skip the last step, where it links to the particular website. Here is the link. This allows the user to automatically update files on the server.
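From the client side, a WebDAV upload is just an HTTP PUT with credentials. A hedged sketch using only the standard library (the URL, user, and password are placeholders, not values from this post):

```python
import base64
import urllib.request

def build_put(url: str, data: bytes, user: str, password: str) -> urllib.request.Request:
    """Build a PUT request with a Basic auth header for a WebDAV endpoint."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(url, data=data, method="PUT")
    req.add_header("Authorization", "Basic " + token)
    return req

# To actually send it (requires a live WebDAV share):
# urllib.request.urlopen(build_put("https://example.com/dav/report.txt",
#                                  b"contents", "user", "pw"))
```

Windows clients would normally map the share as a network drive instead; this just shows what is happening on the wire.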
Then I had to enable some sort of authentication (I used Basic Auth, as there was only one user). This website filled that in (this guy at Microsoft is stellar – he is a teaching kind of guy). Continue reading
Sometimes when you are writing a program in Visual Studio you suddenly come to the realization that one of the folders you are using in your project really doesn't make sense anymore. Maybe you changed what it does, or it has evolved in purpose, or maybe the original creator simply gave it a bad name. Well, there is a way to change the project folder name at any time. Find the location of the project in Explorer, simply change the specific project folder name with the rename tool, and then launch Microsoft Visual Studio. Visual Studio, upon launching, will inform you that the particular project you are loading cannot be loaded properly. This is because the file path it was using now leads nowhere. The solution is to click on the particular project (not the solution) and click Properties, then find the path. Since Visual Studio detects that there is an issue with the path, it will now allow you to manipulate it. Change the part of the path that has the old name and replace it with the new one. Now the project folder name is fixed, and all you have to do is reload the particular project and it will be ready to use!
What I have learned about SMTP. I created SMTP credentials on AWS and assigned the main company email to that source. The service, however, only has three locations: west US, east US, and the EU. Since we live in Canada, I figured US West would be the best one, which means the host is email-smtp.us-west-2.amazonaws.com. The AWS system requires a few things to run properly. First, it needs to use port 587, as Amazon cuts off all email communication on port 25. Second, it requires the use of Transport Layer Security. Then there is the FileMaker side. In FileMaker Server Admin one can go to General Settings, and under Email Notifications one can find the SMTP information. Here one can fill in all of the information that Amazon requires. I used the Amazon host as the SMTP server address, filled in 587 as the port, tried both forms of SMTP authentication, checked "use Transport Layer Security", and filled in the credentials for reaching the host (or endpoint), then hit the "Test SMTP Settings…" button. This unfortunately always said the same thing: SMTP test failed, check your settings or email. In the activity log this was recorded as "email notification to administrators failed: 1506". When I looked this up on the internet, the best answer I could find indicated that something was going wrong externally to FileMaker and that this represents a generic error meaning the email did not go through. So far the emails do not work despite seemingly having all the proper information.
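To separate a FileMaker problem from an AWS problem, it helps to reproduce what FileMaker is doing with a plain SMTP client. A sketch of that conversation – the host and port follow the post; the credentials and addresses are placeholders, and sending for real also requires a verified SES identity:

```python
import smtplib
from email.message import EmailMessage

HOST, PORT = "email-smtp.us-west-2.amazonaws.com", 587  # from the post

def build_message(sender: str, to: str, subject: str, body: str) -> EmailMessage:
    """Assemble a simple text email."""
    msg = EmailMessage()
    msg["From"], msg["To"], msg["Subject"] = sender, to, subject
    msg.set_content(body)
    return msg

def send(msg: EmailMessage, user: str, password: str) -> None:
    """Connect the way FileMaker should: port 587 with STARTTLS, then login."""
    with smtplib.SMTP(HOST, PORT) as smtp:
        smtp.starttls()              # Amazon refuses plaintext; TLS is mandatory
        smtp.login(user, password)   # the SES *SMTP* credentials, not IAM keys
        smtp.send_message(msg)
```

If this script sends but FileMaker's test still fails, the problem is on the FileMaker side; if it also fails, the error message from `smtplib` is far more specific than error 1506.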
The following website is a step by step method of putting the SSL server certificate files in your website and then setting up the domain:
From the looks and current state of our website, we need to use the “Setup a SSL certificate to work with your site” option within the SSL/TLS Manager. It also appears that one requires a certificate, which we must either make ourselves or obtain.
James’ updated instructions for dealing with a new website with IIS
- it is possible that IIS Manager will create the application pool for you; if it does, it may create it on version 2.0 of the .NET Framework. Instead, make this version 4
– install IIS
– make a dir c:\inetpub\wwwroot\fabchoice\showSsImages
– the website root is c:\inetpub\wwwroot\fabchoice and the app is showSsImages; now slide this into the root
– change default website to port 81 to make room for our new service
– enable directory browsing in IIS Manager. This allows you to test, but it has to be disabled later
– if you suspect someone is already using port 80, you can make OUR items appear on port 81 instead
– make new website Fabchoice
– add an app for showSsImages
– make an app pool ‘fabchoice’ and have it run as ApplicationPoolIdentity
– set permissions (recursively) for IIS_IUSRS and NETWORK SERVICE on the files at and beneath:
ii) your SsData directory (often this is c:\SsData – but it could be anywhere depending on your client)
iii) full permissions for this user on the webroot\showSsImages\SSImageCache
– ensure that the SS large data store gives IIS_IUSRS and NETWORK SERVICE full write permissions – most likely they only need read
– install IIS – might need to reinstall it and re-register ASP.NET 4.0 … as per
– JUST REBOOT – NO QUESTIONS ASKED…
– install viewer
– SS password for sa is Sunrise1 normally or Slabsmith1 is an alternative
– at Blasius – I had this error “Handler “PageHandlerFactory-Integrated” has a bad module “ManagedPipelineHandler” in its module list” and this fixed it: http://stackoverflow.com/questions/6846544/how-to-fix-handler-pagehandlerfactory-integrated-has-a-bad-module-managedpip
At Blasius there were no locations. So – until this is addressed – we need to add one and click ‘apply’. Call it test
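The site/app-pool steps in the checklist above map onto a handful of appcmd calls. This is only a sketch for reference: the names and paths follow the post, but verify each switch against your IIS version before running anything (appcmd lives in %windir%\system32\inetsrv), and note that permissions, directory browsing, and the port-81 shuffle are not covered here.

```python
SITE, POOL, APP = "Fabchoice", "fabchoice", "showSsImages"
ROOT = r"c:\inetpub\wwwroot\fabchoice"

def appcmd_steps(port: int = 80) -> list:
    """Return the appcmd commands corresponding to the checklist above."""
    return [
        # app pool; IIS 7.5+ runs it as ApplicationPoolIdentity by default
        f"appcmd add apppool /name:{POOL}",
        # new website bound to the chosen port, rooted at the fabchoice dir
        f"appcmd add site /name:{SITE} /bindings:http/*:{port}: /physicalPath:{ROOT}",
        # point the site's root application at our pool
        f'appcmd set app "{SITE}/" /applicationPool:{POOL}',
        # the showSsImages app under the site root
        f"appcmd add app /site.name:{SITE} /path:/{APP} /physicalPath:{ROOT}\\{APP}",
    ]
```

Printing the list and pasting the lines into an elevated prompt keeps a human in the loop, which on a client's server is probably wise.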
So, we are moving off to the cloud for some clients. Here is an article that helps you “roll your own” backup of AWS servers using AMIs (Amazon Machine Images). It deletes old backups, etc. What I really like about the article is that it makes you understand the manual process first: it leads one through two different scenarios – a snapshot backup and an AMI backup – BEFORE instructing one to install the CLI (command line interface) and roll your own Unix scripts. Articles follow. Know how each affects costs – EBS? S3? Your own? How much your ISP will charge for transferring and how long it will take, etc. Continue reading
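The “delete old backups” half of such a scheme boils down to a retention rule, shown here stripped of the AWS plumbing so the logic is auditable on its own. The image ids and the 7-day window are illustrative; a real script would feed this from `aws ec2 describe-images` and pass the result to `aws ec2 deregister-image`:

```python
from datetime import datetime, timedelta

def images_to_delete(images, now: datetime, keep_days: int = 7) -> list:
    """Given (image_id, creation_time) pairs, return the ids older than the window."""
    cutoff = now - timedelta(days=keep_days)
    return [image_id for image_id, created in images if created < cutoff]
```

Testing this function against fake dates before wiring it to the CLI is exactly the “understand the manual process first” discipline the article recommends.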