Monday, October 8, 2012

[smf_addin] Digest Number 2366

13 New Messages

Digest #2366
1b Re: Screen scrape data to plot intraday chart by "Randy Harmelink" rharmelink
1d Re: Screen scrape data to plot intraday chart by "Randy Harmelink" rharmelink
1e Re: Screen scrape data to plot intraday chart by "metastkuser100" metastkuser100
1f Re: Screen scrape data to plot intraday chart by "Randy Harmelink" rharmelink
1h Re: Screen scrape data to plot intraday chart by "Randy Harmelink" rharmelink
3a Google Docs Integration by "westes2" westes2
3b Re: Google Docs Integration by "Randy Harmelink" rharmelink
4a Re: Welcome back Randy!!! by "mdediegop" mdediegop

Messages

Sun Oct 7, 2012 6:20 pm (PDT) . Posted by:

"mhreyn" mhreyn

Hi,

I would be interested in learning/contributing to this effort using Python. Let me know if you are interested.

--- In smf_addin@yahoogroups.com, "freebelvin" <freebelvin@...> wrote:
>
> Greetings to all via my first post to this group.
>
> I recently joined this group because you all do some things I wanted to learn. I've been studying the literature on the site and I'm very impressed with the level of technology, and wish to express my thanks to Randy and others for developing these tools and generously sharing them with others.
>
> It turns out that I'm an engineer and have a few tricks too. I thought I might take this opportunity to describe what I've been doing, and the question in this post is closely related to my approach. First let me say that I do not in any way put this out as any form of technology competition! I only bring this up as a way of expanding people's awareness of what is possible.
>
> What I have done is to use the same URL linking and fetching mechanisms embedded inside the excel macros, but I have embedded them instead into python scripts. This leads to much more programming effort, but also provides much more flexibility in handling the data. In the example below, a script could be coded with the website URL and fields to extract and fetch the data on a schedule. The data could then be printed to the screen, or additional algorithms could be attached to generate alerts, perform follow-on calculations, etc. The possibilities are broad, but I admit, this does require a level of software skill. And I will comment that as a C programmer, learning python syntax has been quite an adventure for me.
>
> OK, that's it. If anyone wishes to know more about this approach to processing web-based market data, please let me know and I can follow up. Again, please consider this to be an augmentation of the technology already existing in this group, and not as the beginning of any type of competition!
>
>
> belvin
>
>
>
> --- In smf_addin@yahoogroups.com, "metastkuser100" <metastkuser100@> wrote:
> >
> > I would like to:
> >
> > a) Screen scrape two numbers from a website, every 15 minutes, which then populate a table in Excel, throughout market hours, M-F.
> >
> > b) Then plot an Excel chart based on the data in the table. As the day progresses, the table and the chart gets automatically updated every 15 minutes.
> >
> > This is possibly way beyond my abilities, but I'm willing to give a real good try!
> >
> > Can the first part be done by SMF plugin?
> >
>
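For anyone curious what belvin's approach looks like in practice, here is a minimal sketch in Python using only the standard library. The URL is the barchart page metastkuser100 asks about later in this digest, and the 15-minute cadence comes from the original question; the function name and the extraction logic are only placeholders that would have to be adapted to the page's actual markup.

import re
import time
import urllib.request

URL = "http://www.barchart.com/stocks/newhilo.php?dwm="
INTERVAL = 15 * 60  # the 15-minute cadence from the original question, in seconds

def fetch_two_numbers():
    # Download the page and pull the first two integers that follow the
    # "1-Month Highs" label. The split/regex below is only a placeholder;
    # real extraction depends on the page's actual markup.
    with urllib.request.urlopen(URL, timeout=30) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    after_label = html.split("1-Month Highs", 1)[-1]
    return re.findall(r">(\d+)<", after_label)[:2]

if __name__ == "__main__":
    while True:
        try:
            # Print to the screen, as belvin describes; the values could just
            # as easily be appended to a CSV that an Excel chart reads from.
            print(time.strftime("%Y-%m-%d %H:%M:%S"), fetch_two_numbers())
        except Exception as exc:
            print("fetch failed:", exc)
        time.sleep(INTERVAL)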

Sun Oct 7, 2012 6:22 pm (PDT) . Posted by:

"Randy Harmelink" rharmelink

Further discussions of Python programming to get data into EXCEL would be
more appropriate for this Yahoo group:

http://tech.groups.yahoo.com/group/xltraders/

On Sun, Oct 7, 2012 at 6:15 PM, mhreyn <mhreyn@yahoo.com> wrote:

>
> I would be interested in learning/contributing to this effort using
> Python. Let me know if you are interested.
>

Sun Oct 7, 2012 6:57 pm (PDT) . Posted by:

"MS" metastkuser100

Hi Randy,

I am trying to extract the top row of numbers (1 Month Highs) from this website: http://www.barchart.com/stocks/newhilo.php?dwm=

The following command is working great, and I'm getting 760 as expected:

=RCHGetTableCell("http://www.barchart.com/stocks/newhilo.php?dwm=",1,"1-Month Highs")

However, the following command is not working, and I'm getting 10_T&dwm=">583

=RCHGetTableCell("http://www.barchart.com/stocks/newhilo.php?dwm=",8,"1-Month Highs")

Very odd, because all I have done is increment the "1" to "8". Would you please help?

Thanks Much!!!

--- In smf_addin@yahoogroups.com, "metastkuser100" <metastkuser100@...> wrote:
>
> I would like to:
>
> a) Screen scrape two numbers from a website, every 15 minutes, which then populate a table in Excel, throughout market hours, M-F.
>
> b) Then plot an Excel chart based on the data in the table. As the day progresses, the table and the chart gets automatically updated every 15 minutes.
>
> This is possibly way beyond my abilities, but I'm willing to give a real good try!
>
> Can the first part be done by SMF plugin?
>

Sun Oct 7, 2012 8:42 pm (PDT) . Posted by:

"Randy Harmelink" rharmelink

The problem is that they have a ">" in a URL within an HTML tag. The add-in
only does rudimentary HTML parsing, and that little bugger ain't part of
what is accounted for.

Since the issue would be consistent, a work-around for that cell could be:

=0+smfStrExtr(RCHGetTableCell("http://www.barchart.com/stocks/newhilo.php?dwm=",8,"1-Month Highs")&"|",">","|")

On Sun, Oct 7, 2012 at 6:57 PM, MS <metastkuser100@yahoo.com> wrote:

>
> I am trying to extract the top row of numbers (1 Month Highs) from this
> website: http://www.barchart.com/stocks/newhilo.php?dwm=
>
> The following command is working great, and I'm getting 760 as expected:
>
> =RCHGetTableCell("http://www.barchart.com/stocks/newhilo.php?dwm=",1,"1-Month
> Highs")
>
> However, the following command is not working, and I'm getting
> 10_T&dwm=">583
>
> =RCHGetTableCell("http://www.barchart.com/stocks/newhilo.php?dwm=",8,"1-Month
> Highs")
>
> Very odd because all I have done is incremented the "1" to "8". Would you
> please help?
>

Sun Oct 7, 2012 9:01 pm (PDT) . Posted by:

"metastkuser100" metastkuser100

Randy, I wonder if you could check that. I tried it (and several variations) and it's giving me #NAME? instead of 583

Thanks

--- In smf_addin@yahoogroups.com, Randy Harmelink <rharmelink@...> wrote:
>
> The problem is that they have a ">" in a URL within an HTML tag. The add-in
> only does rudimentary HTML parsing, and that little bugger ain't part of
> what is accounted for.
>
> Since the issue would be consistent, a work-around for that cell could be:
>
> =0+smfStrExtr(RCHGetTableCell("
> http://www.barchart.com/stocks/newhilo.php?dwm=",8,"1-Month
> Highs")&"|",">","|")
>
> On Sun, Oct 7, 2012 at 6:57 PM, MS <metastkuser100@...> wrote:
>
> >
> > I am trying to extract the top row of numbers (1 Month Highs) from this
> > website: http://www.barchart.com/stocks/newhilo.php?dwm=
> >
> > The following command is working great, and I'm getting 760 as expected:
> >
> > =RCHGetTableCell("http://www.barchart.com/stocks/newhilo.php?dwm=",1,"1-Month
> > Highs")
> >
> > However, the following command is not working, and I'm getting
> > 10_T&dwm=">583
> >
> > =RCHGetTableCell("http://www.barchart.com/stocks/newhilo.php?dwm=",8,"1-Month
> > Highs")
> >
> > Very odd because all I have done is incremented the "1" to "8". Would you
> > please help?
> >
>

Sun Oct 7, 2012 9:12 pm (PDT) . Posted by:

"Randy Harmelink" rharmelink

Copy and paste and it worked fine here...

Do you have an old version of the add-in that doesn't have the smfStrExtr()
function? What item in the formula is triggering the #NAME? error?

On Sun, Oct 7, 2012 at 9:01 PM, metastkuser100 <metastkuser100@yahoo.com> wrote:

> Randy, wonder if you check that. I tried it (and several variations) and
> it giving me #NAME? instead of 583
>
> Thanks
>
> --- In smf_addin@yahoogroups.com, Randy Harmelink <rharmelink@...> wrote:
> >
> > The problem is that they have a ">" in a URL within an HTML tag. The
> add-in
> > only does rudimentary HTML parsing, and that little bugger ain't part of
> > what is accounted for.
> >
> > Since the issue would be consistent, a work-around for that cell could
> be:
> >
> > =0+smfStrExtr(RCHGetTableCell("
> > http://www.barchart.com/stocks/newhilo.php?dwm=",8,"1-Month
> > Highs")&"|",">","|")
>

Sun Oct 7, 2012 10:34 pm (PDT) . Posted by:

"MS" metastkuser100

I downloaded the latest SMF add-in and it works! However, the number in the cell is a live clickable link (blue and underlined). Is this expected?

--- In smf_addin@yahoogroups.com, Randy Harmelink <rharmelink@...> wrote:
>
> Copy and paste and it worked fine here...
>
> Do you have an old version of the add-in that doesn't have the smfStrExtr()
> function? What item in the formula is triggering the #NAME? error?
>
> On Sun, Oct 7, 2012 at 9:01 PM, metastkuser100 <metastkuser100@...> wrote:
>
> > Randy, wonder if you check that. I tried it (and several variations) and
> > it giving me #NAME? instead of 583
> >
> > Thanks
> >
> > --- In smf_addin@yahoogroups.com, Randy Harmelink <rharmelink@> wrote:
> > >
> > > The problem is that they have a ">" in a URL within an HTML tag. The
> > add-in
> > > only does rudimentary HTML parsing, and that little bugger ain't part of
> > > what is accounted for.
> > >
> > > Since the issue would be consistent, a work-around for that cell could
> > be:
> > >
> > > =0+smfStrExtr(RCHGetTableCell("
> > > http://www.barchart.com/stocks/newhilo.php?dwm=",8,"1-Month
> > > Highs")&"|",">","|")
> >
>

Sun Oct 7, 2012 10:40 pm (PDT) . Posted by:

"Randy Harmelink" rharmelink

Definitely not.

However, that would be controlled by formatting, not by an add-in function.
A function can only return a value to a cell, not format it.

On Sun, Oct 7, 2012 at 10:34 PM, MS <metastkuser100@yahoo.com> wrote:

> I downloaded the latest SMF addin and it works! However, the number in the
> cell is a live clickable link (and blue, underlined). Is this expected?
>

Sun Oct 7, 2012 10:24 pm (PDT) . Posted by:

"westes2" westes2

Is there any documentation for how you can retrieve an arbitrary line on the balance sheet or income statement on a particular data source web site?

For example, I would like to retrieve tangible book value from Yahoo.

Or, from the Yahoo balance sheet, I would like to recover the preferred shares entry under the Equity section.

Sun Oct 7, 2012 10:39 pm (PDT) . Posted by:

"Randy Harmelink" rharmelink

Check out the documentation for the RCHGetTableCell() function in the
DOCUMENTATION folder in the FILES area of the group. For example, to get
the most recent # of preferred shares for WFC from Yahoo:

=RCHGetTableCell("http://finance.yahoo.com/q/bs?s=WFC",1,">Preferred Stock")
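
Tangible book value should work the same way. Assuming Yahoo labels that balance-sheet row "Net Tangible Assets", something along these lines should pull it:

=RCHGetTableCell("http://finance.yahoo.com/q/bs?s=WFC",1,">Net Tangible Assets")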

On Sun, Oct 7, 2012 at 10:22 PM, westes2 <westes@earthbroadcast.com> wrote:

> Is there any documentation for how you can retrieve an arbitrary line on
> the balance sheet or income statement on a particular data source web site?
>
> For example, I would like to retrieve tangible book value from Yahoo.
>
> Or from the Yahoo balance sheet I would like to recover the preferred
> shares entry under the Equity section of the balance sheet.
>

Sun Oct 7, 2012 10:24 pm (PDT) . Posted by:

"westes2" westes2

I don't suppose the author would consider implementing these functions for use inside of Google Spreadsheets? It would be enormously powerful to build financial models that retrieved updated data and could be shared with the world through the cloud.

Sun Oct 7, 2012 10:32 pm (PDT) . Posted by:

"Randy Harmelink" rharmelink

Since Google doesn't utilize VBA, it wouldn't be possible.

However, Google has its own version of such extractions, using the
GoogleFinance(), ImportData(), ImportHTML(), and ImportXML() functions.
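
For example, a plain table pull in a Google spreadsheet might look something like this (the table index here is a guess and would need to be adjusted for the page in question):

=ImportHtml("http://www.barchart.com/stocks/newhilo.php?dwm=","table",1)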

On Sun, Oct 7, 2012 at 10:24 PM, westes2 <westes@earthbroadcast.com> wrote:

> I don't suppose the author would consider implementing these functions for
> use inside of Google Spreadsheets? It would be enormously powerful to
> build financial models that retrieved updated data and could be shared with
> the world through the cloud.
>

Mon Oct 8, 2012 12:32 am (PDT) . Posted by:

"mdediegop" mdediegop

I didn't know about your condition; I hope you get well in the least painful way. I recently cut a tendon in my hand, so I know how painful rehab can be...

--- In smf_addin@yahoogroups.com, Randy Harmelink <rharmelink@...> wrote:
>
> Appreciate it. I was in the hospital for 4 days and then rehab for 17 days,
> because of a continuing leg/knee problem. Still not solved, and won't be
> unless I lose a lot of weight, but at least I can transfer well enough to
> get around.
>
> On Sat, Oct 6, 2012 at 2:21 PM, Ron Spruell <hashky@...> wrote:
>
> >
> > I didn't know how long you were going to be AWOL, so I started trying to
> > answer some of the questions that I thought I knew the answers.
> >
>
