
SEO Principles

May 24, 2014 // Posted in General, Main, Tips and Tricks


Search Engine Optimisation is key to your site being found on the Internet.

SEO is your way of improving your website’s ranking in Google, Yahoo and Bing. The more time, and possibly money (if you employ an external SEO company), you spend on it, the better your chance of being the first search result listed by Google – the ultimate goal of any website, because that is how your potential customers find you easily.

The simplest technique is altering the text, or ‘content’, on your website. To do this you must first understand your website’s target audience: who are they, and what words will they type into Google when they are looking for this particular service or product? There are of course many possibilities, so it is important to investigate those options and compile a list of the most appropriate keywords and phrases.

Once your keywords and phrases have been researched, your content can be re-structured so that it is ‘optimised’ and SEO friendly. You can get professional help with this, so talk to an SEO specialist company about what they would recommend. Other important actions include linking (both internally, within your own site, and externally, with other websites providing good links to yours, preferably from PR3 or higher pages) and implementing meta tags, sub-headings and page descriptions on all of your web pages. Content is king, so your success will be determined by the quality and relevance of your page content.
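
To give a rough idea, here is the sort of markup those on-page elements translate into – a minimal sketch in which the page, wording and keywords are entirely made up:

<head>
  <title>Hand-Made Oak Furniture in Norfolk | Example Co</title>
  <meta name="description" content="Bespoke oak dining tables and chairs, hand made to order and delivered across the UK.">
</head>
...
<h1>Hand-Made Oak Furniture</h1>
<h2>Oak Dining Tables</h2>

The description is often what appears as the snippet under your listing in the search results, so write it for people as well as for the search engines.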

Two important SEO terms to understand are ‘Black Hat’ and ‘White Hat’ strategies. They are very different, and it helps to understand the difference between them before talking to an SEO company about tactics for your website, so that you can make the right decisions.

‘White Hat’ SEO companies will use or recommend good design, good relevant content and appropriate linking. These achieve longer-lasting results and rankings.

‘Black Hat’ SEO companies, on the other hand, use underhand and inappropriate tactics to get fast results, at the expense of a long-term strategy and a sustainable website. They hide bulk keyword text by making the font colour the same as the background so it doesn’t display, or by using very small font sizes so that the text is not readable by humans. In some cases this produces an immediate increase in ranking, but it won’t be very long before the search engines start imposing penalties on those sites, and they may even be removed from the search results completely.

So if you choose to use an external SEO company, be aware of these two types of SEO companies and ensure you choose the right one.

Keep up to date with what the search engines take into account when ranking web pages, and adjust your content regularly to keep your site high in the rankings.

Also make sure you re-submit your site map to the search engines regularly, and every time you make major changes to your site, so that your search results stay accurate and do not link to pages or content that no longer exist.
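
If you don’t already have one, a site map is just a small XML file listing your pages – a minimal sketch (the URLs and dates are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-05-24</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/products.html</loc>
    <lastmod>2014-05-20</lastmod>
  </url>
</urlset>

The major search engines let you submit and re-submit this file through their webmaster tools.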

 

 

Slow Internet and Jerky Flash Videos in Windows 7?

May 21, 2014 // Posted in Computer Tips, Main, Tips and Tricks

Slow PC?

Slow Internet?

Do you have a DSL or fibre connection that is advertised as fast, but you are still getting slow responses in Windows 7, and particularly jerky Flash videos?

If the answer is yes, then here is something to try. It worked for me.

First of all, check the current state of your TCP/IP settings. To do this, open a command prompt at administrator level (click Start, type cmd into the search box, then right-click cmd.exe and choose Run as administrator).

To check the current state, type the following at the command prompt:

netsh int tcp show global

and press Enter. You will see something like the image below (save a copy of your settings so you can revert to the originals if required):

(screenshot: the original TCP Global Parameters reported by netsh)
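
If the screenshot doesn’t display for you, the report on a default Windows 7 installation typically looks something like this – the exact fields and values will vary with your service pack and network adapter:

TCP Global Parameters
----------------------------------------------
Receive-Side Scaling State          : enabled
Chimney Offload State               : automatic
NetDMA State                        : enabled
Direct Cache Access (DCA)           : disabled
Receive Window Auto-Tuning Level    : normal
Add-On Congestion Control Provider  : none
ECN Capability                      : disabled
RFC 1323 Timestamps                 : disabled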

We need to change those settings so that the report reads:

(screenshot: the target TCP Global Parameters after the changes below)
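
In other words, after running the commands below, the same report should show roughly these values (this listing is simply the set of values the commands in this post apply, so treat it as a guide rather than gospel):

TCP Global Parameters
----------------------------------------------
Receive-Side Scaling State          : enabled
Chimney Offload State               : enabled
NetDMA State                        : enabled
Direct Cache Access (DCA)           : enabled
Receive Window Auto-Tuning Level    : disabled
Add-On Congestion Control Provider  : ctcp
ECN Capability                      : disabled
RFC 1323 Timestamps                 : disabled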

 

So first, let’s make Windows respect any user-set TCP window auto-tuning level by disabling the Windows scaling heuristics. Type netsh int tcp set heuristics disabled at the command prompt and press Enter. You should get an OK message.

Next let’s disable the auto-tuning level by typing

netsh int tcp set global autotuninglevel=disabled

at the command prompt and again press Enter. You should once again get an OK message.

Now we will improve throughput by enabling CTCP (Compound TCP). Type

netsh int tcp set global congestionprovider=ctcp

at the command prompt and press Enter. Check you get an OK message again.

Now we will change the ECN (Explicit Congestion Notification) by typing

netsh int tcp set global ecncapability=default

at the command prompt and press Enter. You should get an OK message again.

Next we will change the receive-side scaling setting by typing

netsh int tcp set global rss=enabled

at the command prompt and press Enter.

Then we enable TCP Chimney Offload by typing

netsh int tcp set global chimney=enabled

at the command prompt and press Enter.

Finally we set the Direct Cache Access (DCA) by typing

netsh int tcp set global dca=enabled

at the command prompt and press Enter.

Check the new settings by again typing

netsh int tcp show global

and press Enter  and you should now see:

(screenshot: the new TCP Global Parameters, now matching the target settings above)

Close the command prompt by typing Exit and press Enter.

It may take a little while for the changes to take effect if you do not re-start your computer.
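
If you expect to apply (or re-apply) these tweaks on more than one machine, you could put the commands into a small batch file and run it from an administrator-level command prompt. A sketch – the file name is just an example:

@echo off
rem tcp-tweaks.bat - run from an administrator command prompt
netsh int tcp set heuristics disabled
netsh int tcp set global autotuninglevel=disabled
netsh int tcp set global congestionprovider=ctcp
netsh int tcp set global ecncapability=default
netsh int tcp set global rss=enabled
netsh int tcp set global chimney=enabled
netsh int tcp set global dca=enabled
netsh int tcp show global
pause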

Here are a few notes on each setting, should you wish to revert to your original configuration.

Windows Scaling heuristics

Windows 7 has the ability to automatically change its own TCP window auto-tuning behaviour to a more conservative state regardless of any user settings. It is possible for Windows to override the autotuninglevel setting even after a user sets a custom TCP auto-tuning level.

Possible settings are: disabled, enabled, default (sets the Windows default state).
Recommended: disabled (to retain your user-set auto-tuning level).

TCP Auto-Tuning

The default auto-tuning level is “normal”, and the possible settings for the above command are:

disabled: uses a fixed value for the tcp receive window. Limits it to 64KB (limited at 65535).
highlyrestricted: allows the receive window to grow beyond its default value, very conservatively
restricted: somewhat restricted growth of the tcp receive window beyond its default value
normal: default value, allows the receive window to grow to accommodate most conditions
experimental: allows the receive window to grow to accommodate extreme scenarios (not recommended, it can degrade performance in common scenarios, only intended for research purposes. It enables RWIN values of over 16 MB)

Compound TCP – Improve throughput
Add-On Congestion Control Provider

The traditional slow-start and congestion avoidance algorithms in TCP help avoid network congestion by gradually increasing the TCP window at the beginning of transfers until the TCP Receive Window boundary is reached, or packet loss occurs. For broadband internet connections that combine high TCP Window with higher latency (high BDP), these algorithms do not increase the TCP windows fast enough to fully utilize the bandwidth of the connection.

Compound TCP (CTCP) is a newer method, available in Windows 7. CTCP increases the TCP send window more aggressively for broadband connections (with large RWIN and BDP). It attempts to maximise throughput by monitoring delay variations and packet loss, and it also ensures that its behaviour does not negatively impact other TCP connections.

By default, Windows 7 has CTCP turned off; it is only on by default under Server 2008. Turning this option on can significantly improve throughput and packet-loss recovery.

Possible options are:  ctcp, none, default (restores the system default value).

ECN Capability

ECN (Explicit Congestion Notification, RFC 3168) is a mechanism that provides routers with an alternate method of communicating network congestion. It is aimed to decrease retransmissions. In essence, ECN assumes that the cause of any packet loss is router congestion. It allows routers experiencing congestion to mark packets and allow clients to automatically lower their transfer rate to prevent further packet loss. Traditionally, TCP/IP networks signal congestion by dropping packets. When ECN is successfully negotiated, an ECN-aware router may set a bit in the IP header (in the DiffServ field) instead of dropping a packet in order to signal congestion. The receiver echoes the congestion indication to the sender, which must react as though a packet drop were detected.

ECN is disabled by default in Windows 7 and other modern TCP/IP implementations, as it is possible that it may cause problems with some outdated routers that drop packets with the ECN bit set rather than ignoring the bit. To check whether your router supports ECN, you can use the Microsoft Internet Connectivity Evaluation Tool. The results will be displayed under “Traffic Congestion Test”.
Possible settings are: enabled, disabled, default (restores the state to the system default).
The default state is: disabled
Recommendation: enabled (only for short-lived, interactive connections and HTTP requests with routers that support it, in the presence of congestion/packet loss); disabled otherwise (for pure bulk throughput with a large TCP window, no regular congestion/packet loss, or outdated routers without ECN support).

 

RSS – Receive-side Scaling

The receive-side scaling setting enables parallelised processing of received packets on multiple processors, while avoiding packet reordering. It avoids packet reordering by separating packets into “flows” and using a single processor to process all the packets for a given flow. Packets are separated into flows by computing a hash value based on specific fields in each packet, and the resulting hash values are used to select a processor to process the flow. This approach ensures that all packets belonging to a given TCP connection are queued to the same processor, in the same order that they were received by the network adapter.

Possible rss settings are: disabled, enabled, default (restores rss state to the system default).
Default state is: enabled
Recommended: enabled (if you have 2 or more processor cores and a NIC that can handle RSS)

TCP Chimney Offload

TCP chimney offload enables Windows to offload all TCP processing for a connection to a network adapter. Offloads are initiated on a per-connection basis. Compared to task offload, TCP chimney offload further reduces networking-related CPU overhead, enabling better overall system performance by freeing up CPU time for other tasks.

The possible states are disabled, enabled,  automatic (only Windows 7 and 2008 Server) as follows:
automatic – This default setting is only available under Windows 7 and 2008 Server. It offloads if the connection is 10 GbE, has a RTT < 20ms, and the connection has exchanged at least 130KB of data. The device driver must also have TCP Chimney enabled.
default – this setting restores chimney offload to the system default. Setting this “default” state under Windows 7 and 2008 Server is possible, but it sets the system to the “automatic” mode described above.
disabled – this setting is manually configured as disabled.
enabled – this setting is manually configured as enabled.
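
Finally, if you decide to undo everything, the following commands should put the settings back to the defaults described in the notes above – but do compare the result with the copy of the original report you saved at the start, as defaults can differ from system to system (for dca and chimney in particular, use whatever values your original report showed):

netsh int tcp set heuristics default
netsh int tcp set global autotuninglevel=normal
netsh int tcp set global congestionprovider=default
netsh int tcp set global ecncapability=default
netsh int tcp set global rss=default
netsh int tcp set global chimney=default
netsh int tcp set global dca=disabled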

I hope you find this useful.

Steve

 

Time to think differently with facebook likes and shares

April 23, 2014 // Posted in General, Main


Apart from the obvious visual changes in facebook, some of which, in my opinion, have not improved the facebook experience, facebook is also changing some of the processes around sharing and liking.

 

If you have an online marketing strategy, you may want to start rethinking how you use Facebook in light of the recent announcement from the company about business spam.

Facebook, like Google, is making changes to its algorithm to ensure users get the best content – most relevant, newest and most original. That bit makes sense.

 

Facebook, however, isn’t just thinking about the content – it also wants to be highly profitable. Their philosophy is: “Why give away free advertising to businesses when you can charge them for it?”

Facebook feels that if it restricts the free visibility it gives to businesses, businesses will pay to get it back.

So what is facebook changing?

1. Asking for Likes, Shares and Comments

 

People and businesses asking for likes, shares and comments in order to promote products is commonplace and expected. As the likes, shares and comments increase, posts appear in more news feeds, increasing brand and product visibility – without the company having to pay for it.

Facebook, however, is now calling this common practice “like-baiting”, and if you do it, the chances are that your business or website will now be prevented from appearing in users’ news feeds. This means that in future it will be much more difficult to get your ‘organic’ posts in front of users.

Facebook wants to crack down on this ‘like-baiting’ practice in order to provide users with what it says is ‘a more relevant experience’. After all, they want users to find content that matters to them. Otherwise what’s the point – and they won’t make any money out of organic posts.

 

2. Frequently Re-Circulating Content

 

In addition to “like-baiting,” Facebook is now making efforts to limit the amount of content that is being re-circulated on News Feeds. Going viral used to be a golden egg search marketers could hope for. Now it holds much less power.

 

Before when content would go viral, it would often happen in waves. Every few months, it would go viral again. Facebook considers this content less relevant to users. According to Facebook, users are complaining about re-circulated content.

 

They’re also complaining about content that isn’t going viral but is simply being re-posted by Pages. Taking the same content and re-posting it will now get you dropped from news feeds. Testing so far has shown that users hide 10% fewer stories from Pages when this update is in place.

 

3. Spamming Links

 

Spammy links come in many forms. Some posts have confusing formatting, and users are “tricked” into clicking a link. Others claim to link to something of relevance, such as a photo album, but when users click the link they end up on a website chock-full of ads.

 

How is Facebook determining which links are spammy? It is tracking how often the original post is liked or shared with friends after the link has been clicked. And this can be a problem.

 

If businesses are posting legit content, but it’s just not getting the number of likes or shares it needs, it could end up being flagged by Facebook. What is a business to do? Ask users to like and share their posts – exactly what Facebook is saying they no longer want businesses to do.

So what are you to do to promote your business on facebook?

1. Don’t ask for likes, comments or shares.

2. Don’t re-post items over and over again.

3. Don’t post misleading links.

4. Last resort, pay for advertising.

 

Cloud Computing – A Stupid Question?

January 3, 2014 // Posted in General, Main

You will probably think this is a stupid question, but what is the difference between cloud computing and what we already had?

 

I have been trying to get my head around this for some time now. Cloud computing is described as storing your data on a cloud (third party) server on the internet.

But if that is what cloud computing is, then isn’t it the same as having your web space on a third-party server and storing your data on a database server linked to that web space? You can also save all your files on that same server.

Obviously a cloud server is not really storing your data in a cloud – it would be very clever if it was. What happens when it rains? Does your data come down with it?

Seriously though, if you have a shared hosting package on someone else’s servers, or even your own dedicated server in someone else’s building, then your data is stored on a physical device that is not directly controlled by you but accessed remotely. Isn’t that the same as a cloud service? Your data is stored on a physical device on someone else’s premises that you manage remotely.

How does one differ from the other? Are they both not the same?

My idea of cloud computing is more like my current method of connecting to the Internet. Due to cabling issues, my service provider connected me to their data centre some 2-3 miles away via a radio link, and in order to get line of sight to the receiver/transmitter at the other end they had to put up a pole over 30ft tall – so my modem really is ‘in the clouds’. Now that’s what I call cloud computing.

 

Comments and answers welcomed

Steve

Is the Internet doomed as we know it?

December 31, 2013 // Posted in General, Main

Computer Crash

Is the Internet breaking apart?

 

I ask this question because of four important facts.

  • China’s Great Firewall segments the Chinese Internet so that the Chinese have great difficulty accessing anything outside China.
  • Russia has planned legislation so that Russian Internet users cannot access foreign services.
  • In November, Germany said that all communications between German authorities would be required to stay within the country.
  • Brazil has also announced plans to create an alternative Internet channel so that its traffic does not have to go through the United States.

The Internet appears to be breaking up into national sectors. In addition, probably driven by the US’s interception and recording of personal transmissions over the Internet (spying in effect), more countries are considering restrictions within their national boundaries.

Countries no longer appear to want their information and their citizens’ comments to be available outside their own borders, and the restrictions they are now placing on how the Internet can be used within their territory only add momentum to this trend. This could be a killer for businesses that get a lot of their income from sales outside their home country: if they are no longer allowed access to those customers, their businesses will surely suffer.

You will be aware of the EU legislation which initially insisted on explicit acceptance of cookies on EU-targeted web sites, and was later watered down to implied acceptance. This created considerable confusion and concern among businesses outside the EU who wanted to reach EU customers but were not sure whether they had to comply with the cookie legislation – hell, there was even more confusion within the EU, as no one was clear on exactly what was required or expected. There are still many sites that should comply with the legislation but do not, yet I have not heard of a single warning or prosecution by the authorities. So what was all that cookie stuff about?

So over the coming year or years, I think we will see more of the Internet breaking up into national segments.

Then there are some other issues that will affect the Internet going forward.

The number of cyber attacks on big financial organisations is increasing and is likely to continue to do so, with attacks on government organisations rising too.

Hackers’ lives are being made easier too: with more and more sensitive data being committed to the ‘cloud’, it is without doubt more accessible to the determined hacker. In addition, there are hundreds of staff managing this ‘cloud’ data within individual organisations. How do we know that one or more of the people with access to this data is not extracting it and selling it on, or using it themselves for personal gain?

IMO it is best to keep data such as this securely in-house where it can be monitored and controlled effectively rather than store it on third party servers, where you really have no control at all, just their word that it is safe.

Phishing is also on the increase and no-one seems to be able to stop it or spamming.

So what is the future of the Internet for 2014?

Will it be anything like it is now in 2015?

 

 

 

HTML5 and CSS3 OR NOT HTML5 and CSS3 Web Developer Dilemma

December 25, 2013 // Posted in Computer Tips, Main

New Formats

Dilemma for web site developers with HTML5 and CSS3.

 

As you are no doubt aware, HTML5, CSS3 and a new version of jQuery aimed at them have been released and are now supported, to some extent, in the latest versions of the major browsers. Some web authoring software has also been updated to use them, such as Serif WebPlus X7.

This is all well and good if you know that everyone who will use your web site has a browser that supports the latest HTML5, CSS3 etc. But how can you determine that? The short answer is: you can’t. This means that users with older browsers will not see your site as you intended it to look, and if you use the new features to do things like mark required form fields, many of them will not be able to use your contact or submission forms at all. It is even more of a problem on mobile sites. How many mobile phone users do you know who update their phone software every time a new version is available? I thought so – almost none! So they will still be using browsers roughly equivalent to IE6 or 7, which have no support for the new HTML code.

There are also millions of PC users throughout the world that use IE6 and 7 browsers and other browsers that do not support HTML5 etc. So are you going to create a site that cannot be accessed by millions of potential visitors?  If your site is a retail business you could be losing a high percentage of your business.

Even the latest versions of the major browsers do not all support the same set of HTML5 and CSS3 features: each supports some features and not others, and the gaps are different from browser to browser.

So which HTML5 and CSS3 features do you use? That is a question I cannot answer for you; you will have to make your own decision on this one.
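
One middle-ground approach is feature detection: test in JavaScript whether the browser actually supports a feature before relying on it, and fall back to something older when it doesn’t. A minimal sketch for the HTML5 date input (the fallback itself is up to you and is only hinted at here):

var testInput = document.createElement('input');
testInput.setAttribute('type', 'date');
var supportsDateInput = (testInput.type === 'date'); // older browsers silently fall back to 'text'

if (!supportsDateInput) {
  // No native date picker here: attach your own script-based picker and validation instead.
}

Libraries such as Modernizr do this kind of testing for you across a long list of HTML5 and CSS3 features.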

And then there is the old IE problem: IE10 and IE11, although allegedly HTML5 and CSS3 compatible, have some major issues with formatting and with the extent of their compliance, particularly around things like transparent PNGs, text formatting and JavaScript behaviour.

So you need to make some important decisions when you create your new web sites:

  • Do you target your sites at visitors with specific browsers? (IMO not a good idea.)
  • Do you create two sites, one for the new protocols and one that is compatible for the old protocols? (Also probably not a good idea unless the extra work involved is not a problem.)
  • If you choose the above, do you include a script that automatically detects the browser and switches accordingly, or do you ask the user to select a version?
  • Do you stick with the earlier versions of HTML and CSS and stay compatible with most browsers? (Maybe a good choice for the time being, until the percentage of users with later browsers increases substantially.)

The other issue is the latest jQuery, which has many changed functions and code. I did see a statement that said it was backward compatible with older versions, but trust me when I say that it is not. I have found this out after spending hours trying to find out why things that used to work fine have all of a sudden stopped working.

WebPlus X7, for example, now uses the later version of jQuery, so if you have, say, a WebPlus lightbox on your page, the later version of jQuery is added automatically.

If you then add features from jQuery UI 1.7, 1.8 etc. (without adding jQuery yourself, because WebPlus has already added it for the lightbox), the jQuery UI functions no longer work. So you either have to move to the later jQuery UI (which doesn’t provide some of the functions of the older versions – bah!) or manually add a different lightbox system along with the earlier jQuery.
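
A third option, if you can edit the page code directly, is to load the older jQuery (plus the jQuery UI build that matches it) under a separate name using jQuery.noConflict, leaving the newer jQuery that WebPlus injects untouched. A rough sketch – the file names, versions and the tabs widget are only examples, not what WebPlus actually outputs:

<!-- the newer jQuery and the WebPlus lightbox scripts are already on the page -->

<!-- older jQuery plus the matching jQuery UI, for the legacy widgets -->
<script src="scripts/jquery-1.4.4.min.js"></script>
<script src="scripts/jquery-ui-1.8.24.min.js"></script>
<script>
  // Hand $ and jQuery back to the newer copy; keep the older one under jqOld.
  var jqOld = jQuery.noConflict(true);
  jqOld(function () {
    jqOld('#tabs').tabs(); // jQuery UI 1.8 widget running against the older jQuery
  });
</script>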

Other third party tools also present a similar issue as many were developed before the new version of jQuery was released.

So if you find that things suddenly do not work when you upgrade your site to HTML5, CSS3 and/or the Latest version of jQuery, perhaps now you will have an idea of where to start looking, and not have to spend hours trying to locate the issues as I have recently.

I hope that this post helps someone in some way.

 

Steve

 

 

 

How to speed up a slow PC

May 24, 2013 // Posted in Computer Tips

Slow PC?

We have all had the problem that as your PC gets older and you install more and more programs your PC gets slower and slower. So what can you do to speed it up again? Here are five things that you can do:

De-fragment
This goes without saying: it is an essential regular maintenance job, yet I am surprised at how few people actually do it when you ask them. Do you do it regularly? Once a month is a good interval. It doesn’t matter which Windows OS you use – make sure you either de-fragment manually or set the machine up to de-fragment automatically at least once a month. When your PC’s files get very fragmented, performance suffers badly. The built-in Windows de-fragmenter works just fine, but if you are looking for something a little better, there are many free and commercial products available.
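
If you prefer the command line, the built-in defragmenter can also be run from an administrator-level command prompt – for example (drive letter assumed to be C:), /A analyses and reports only, while /U /V defragments and shows progress with a verbose report:

defrag C: /A
defrag C: /U /V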

Clean up the hard drive
Have you ever filled up a hard drive? If that drive contains both your OS and your data files, your machine is going to die! This is often a major cause of slow-running PCs: Windows needs roughly 10-15% of the drive free to create its temporary working files. If you haven’t got at least 20% free space, I would recommend starting a clean-up. The built-in Windows Disk Clean-up utility quickly clears out temporary files in various categories, such as temporary Internet files, downloaded program files, the Windows temp directory and more. Access it by right-clicking the drive in My Computer, choosing Properties and clicking Disk Clean-up on the General tab. Once you’ve done that, check your pictures, music and videos, as these are usually quite large files; delete the ones you no longer need or copy them to an external drive or CD/DVDs. Then check all your document files, delete those no longer needed and archive the ones you want to keep to another drive or DVD/CD. Next you can remove old restore points and shadow copies (from the System Restore utility). Then check your installed programs: do you need them all? There are probably some you haven’t used for years, so get rid of them using Add/Remove Programs in Control Panel. When you have done all this, empty the Recycle Bin and check your drive space again by right-clicking the drive in My Computer and clicking Properties.
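
Disk Clean-up can also be started from the Start menu search box or a command prompt by running cleanmgr – for example, to point it straight at drive C: (adjust the drive letter to suit):

cleanmgr /d C: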

Now it’s time to clean up the registry
Errors in the registry cause major slowing down of the PC and can cause it to stop altogether. Modifying the registry is not something the novice or inexperienced user should attempt by hand, as incorrect changes can prevent your computer from even starting up. Before you make any modifications, either manually or using a professional tool, ALWAYS back up the registry first so you can restore it if something goes wrong. There are many software tools available, some free and some commercial, and most of them will find plenty of errors in your registry. It’s not always something you’ve done: over time, removing programs, upgrades and driver changes all leave remnants in the registry that should not be there. Run a registry fix tool and let it fix the errors it finds, then reboot to confirm your PC still works. (Most registry fixers will make a backup of anything they change, so that you can restore it in Safe Mode if something goes wrong, but that doesn’t happen very often if you pick a good tool.) After the reboot, run the registry fix again, as more errors will be found that do not become apparent until the first ones have been removed. Reboot and run the tool again, repeating the process until no errors are found.

Remove spyware/malware
If you are using Windows you MUST have an anti-malware program installed, or your machine is guaranteed to get infected with spyware, which will gradually make your PC slower and slower, not to mention the potential privacy and security issues. Malwarebytes is good for this; although most anti-virus packages include anti-malware, they do not always find all of it, and there are both free and commercial versions of Malwarebytes available. I have found that it usually picks up anything my anti-virus package misses (I’ve tried various anti-virus packages and Malwarebytes always finds something that each one of them didn’t). Get the program updates regularly and run it at least once a week. You can even trigger it to start at a specific time and day each week using the Windows Scheduled Tasks facility.

Check the hard drive for errors
After long periods of use the hard drive develops ‘bad sectors’, fragments of files get left behind in the indexes, and other information in the file descriptors can become corrupt. To fix it, you’ll need to check the drive for errors. Right-click the drive in My Computer and click Properties > Tools > Check Now, tick the options to automatically fix file system errors and to scan for and attempt recovery of bad sectors, then click Start. If this is your Windows drive, or it contains files that Windows needs to operate (e.g. the swap file, system files and some drivers), you will be told that Windows needs exclusive access to the drive and cannot perform the check now, and you will be asked whether you want to schedule the check for when Windows next starts. Click Yes and reboot; the disk check will then run before Windows starts. This may take some time, as it will check every part of the drive surface, every file, folder, descriptor table etc. You can also get third-party disk checkers, but the built-in Windows one seems to work just fine.
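
The equivalent from an administrator-level command prompt is chkdsk. For example, to fix file system errors and scan for bad sectors on drive C: (it will offer to schedule the check for the next restart if the drive is in use):

chkdsk C: /F /R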

Short URLs good or bad

May 24, 2013 // Posted in Computer Tips

First of all for those of you that do not know what a shortened URL is, here’s a quick explanation.

A URL shortener allows users to shorten, share, and often track links (web addresses). Reducing the length of the URL makes sharing easier. For example, the shortened link http://bit.ly/steverblog would actually take you to my blog on MyOpera at http://my.opera.com/SteveRiches/blog/, and http://bit.ly/b3hHHs would actually take you to one of my web sites, http://www.richosoft.co.uk/.

Shortened URLs can be obtained for FREE from places like http://bit.ly and others. Some, like bit.ly, will also offer tracking statistics on your shortened URLs. This allows you to see which site the visitor was referred by, how many people have clicked the URL, where they came from, what browser they were using and much more.

OK, so I have have a shortened URL for my page. What use is it to me or my friends, colleagues or business contacts?

  •     When Tweeting on Twitter, I can save characters in my tweet by reducing the length of a link to my website that I wish to include, but the viewer can still reach the long link address in one click.
  •     When sending an email and including my web address, I can shorten any link to my website, making the email less cluttered, and if it is a very long web address, avoid the viewer having to cut and paste the address because it wouldn’t fit on one line in the email.
  •     I can get extensive tracking statistics. (see above)
  •     I don’t have to type long web addresses into emails, tweets and facebook posts, reducing the possibility of typing errors.
  •     Links look tidier.

So much for the benefits – what are the possible problems?

When you click the link http://bit.ly/b3hHHs you do not know where it is going to take you, whereas when you see the link http://www.richosoft.co.uk/, you can be pretty certain you are going to the RichoSoft site.
http://bit.ly/b3hHHs could be taking you to a porn site, phishing site, malware distribution site or some other un-desirable site.

ADVICE
My advice to you is to only click shortened links if you are confident that the source is genuine and the link valid. Some big companies use shortened links including companies like The BBC, Opera and big retail organisations, and the links they may send you, or post on twitter can usually be trusted.

Also whatever link you see in an email or internet page, be aware that you might not actually be going to the site shown, eg: Click this link: http://www.bbc.co.uk

Where did it take you?    Did you see the BBC home page?   You now see what I mean?
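
The trick above works because the visible text of a link and its actual destination are completely independent. In the page’s HTML it looks something like this (a made-up example re-using the shortened link from earlier):

<a href="http://bit.ly/b3hHHs">http://www.bbc.co.uk</a>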

In most browsers today, you can preview where the link is really taking you by first hovering over the link and checking out the status bar,  some email clients do the same or similar. So before clicking a link check it out first.

How to remember many Strong Passwords

May 24, 2013 // Posted in Computer Tips

We all use the Internet more and more, and now regularly visit many sites that require a password.

Strong Passwords are Good

 

But how do you remember so many different passwords, when it is best to set a different password for each site?
You can do it effectively like this:

 

 

  1. Write just part of your password down, or store in a file, or on your mobile phone. But what if someone steals my phone or computer or finds my paper? That is OK, because you are only writing down part of your password, and they will not know what site it refers to, as we will see in a moment.
  2. Create STRONG passwords with letters, numbers and non-alphanumeric characters, and keep them as long as possible. Mix upper and lower case characters. This way they cannot be guessed. Never use your date of birth, or house number/post code or name in your password, as people can get these off the Internet in places like Facebook etc.
  3. Now you need to create your passwords:
  • Think of a PIN that you will always remember, 3 ,4 or 5 characters long, something like  3478# or 8#7 or 23&4.
  • For each web site you need a password for, create a code that helps you remember what site it is for, e.g. FBk for facebook, RSoF for RichoSoft, TWit for Twitter etc. Next add some random characters, e.g. 4556 or zc98@. Use different random characters for each password.

    These you write down, save in a file or store in your phone. Anyone finding these would not know what they refer to, and the password is not complete anyway, so would not work.

You now have passwords that look like this:

FBk298745+C
RSoFhgTf89%F
TWitBV65W_u

 

  • Next we use your PIN. Decide whether your PIN will be at the beginning or end of your password, and when using the passwords add  your PIN, to this position. So if we had a pin 3478#  our passwords above  would become:
FBk298745+C3478#
RSoFhgTf89%F3478#
TWitBV65W_u3478#
OR
3478#FBk298745+C
3478#RSoFhgTf89%F
3478#TWitBV65W_u
  • These full passwords are the ones to use on the sites, and now we have STRONG passwords that cannot possibly be guessed, and you only need to remember the PIN Part and whether it is at the front or back. The rest is written down or saved  so you do not forget it. By having a different password for each site, if someone does actually find out one of your passwords the others are totally safe.

Some background

April 13, 2013 // Posted in Main

 

For the first main post I thought I would give you some background on how I got into creating Software, Web Sites and Web Applications so here goes…

I was born at a very young age, and whilst I was at Grammar School got a Saturday and Holidays job at Currys, the Electrical People.

A Currys Shop

A Currys Morris 1000 van

It was here that I got my first taste of technology, although it was very different to what we have today. The closest thing we had to a personal computer was Prestel from the Post Office. This involved a TV and a telephone line: data retrieved over the telephone line was displayed on the TV, but it was text only, with no images – similar to Teletext.

Prestel (Dr Alex Reid)

Viewdata (Prestel) Ad

When I left school I joined Currys full-time as a Trainee Manager, and it wasn’t long before I got my first store in a small market town called Diss, which is also where I went to Grammar School. The nearest we got to a computer at that time was the company mainframe. All stock came with a stock docket, which was a machine-readable piece of paper. When you sold the product, you marked a line in a little box with an HB pencil and sent it off to Head Office. This, along with millions of others, was then fed into a hopper and read by the mainframe, and your stock was adjusted. A few days later you received a dot-matrix printed report detailing what had been processed.

I was promoted to Thetford, and this is where I got my first taste of a ‘personal’ computer. The Sinclair ZX80 and Commodore VIC20 both arrived on the scene at around the same time, both with only a tiny amount of memory. The ZX80 was available in kit form, and you had to build it yourself.

Sinclair ZX80

Sinclair ZX81

Commodore 16/VIC-20

Later Sinclair launched the ZX81, which was available in kit or ready-built form. You could add a 16Kb memory pack onto the rear connector to increase the ‘power’ of the machine, but if you moved the computer the connection to the RAM pack inevitably came apart and everything you had done was lost. Saving programs and data on both the VIC20 and the Sinclairs was done by saving to a standard cassette tape recorder, which was neither fast nor reliable. However, this is where I got my first programming experience. The Sinclair’s ‘Basic’ was not typed in as individual characters but as keyword tokens: if you wanted to use the PRINT command, you couldn’t type P,R,I,N,T while in programming mode; you pressed P and the command PRINT appeared on the screen. You also had to have a TV to view the output from these computers.

The Apple IIe, the BBC and Acorn computers also appeared, and they used 5¼” floppy diskettes. These too had limited memory and only very basic graphics.

BBC Computer

Apple IIe

Acorn Electron

I did some ‘real’ programming on an Apple IIe that my District Manager had been given, and wrote my first program that took the user’s entries and stored them as separate data files.

Later Amstrad started to develop computers, with the CPC 464 and CPC 6128, and later the 1512, 1640, PCW and PCW512, and then the PC2086 and PC2286.

Amstrad CPC 464

Amstrad CPC 6128

Amstrad 1512/1640

Amstrad 2086

The 464, 6128 and PCW range used CP/M instead of Basic and DOS. It was similar to Basic to program, and I soon started writing applications for CP/M. I wrote some applications for use by Area Managers at the Divisional Office and they were very successful.

I was promoted to Divisional Office and soon started using the Amstrad 1512 with its dual 5¼” floppy disks, and began learning Microsoft Basic. Graphical displays could now be achieved. But it wasn’t until I got onto the Amstrad 2086, with its 3½” floppy diskettes (in hard plastic cases) and its 20Mb hard drive, that I got my first taste of Windows (3.1). Although you could have several things open in ‘Windows’ at the same time, only one could actually run at a time.

Then things started to develop in leaps and bounds: first bulletin boards accessed via modems (300bps, then 1200/75bps), and then the Internet at speeds of up to 14.4Kbps, later 28.8Kbps and even 56Kbps. This is when I created my first web site. Everything was manually coded then – there wasn’t any fancy web authoring software, images were limited (as was what you could do with them), and there wasn’t even JavaScript yet.

Over the next few years I changed jobs within the Currys/Dixons/PCWorld group and became more involved with writing applications in Paradox, Access and the like, and also with developing web applications in ASP.

In 2006 I was made redundant, and later the same year took early retirement. This gave me plenty of time to spend on my hobby (computers) and to develop more software, web sites and web applications.

So there you have it: I have seen the advancement of the technology from its infancy to the current day, and I like to think that I have developed and kept up with it too, at least a little. And you know, all of the principles I learnt all those years ago are still just as relevant today. Some may have been forgotten, or never learnt, by some of the newer programmers now on the scene, but I will never stop remembering or using them where they are appropriate.

 

 

 
